[MINI] Activation Functions
Data Skeptic · 16 Jun 2017


In a neural network, the output value of a neuron is almost always transformed in some way by a function. A trivial choice would be a linear transformation, which can only scale the data. Other transformations, such as a step function, allow non-linear properties to be introduced.
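As a minimal sketch of this contrast (the function names here are illustrative, not from the episode), a linear activation only rescales its input, while a step function produces an abrupt, non-linear jump:

```python
import numpy as np

def linear(x, a=2.0):
    # A linear activation can only rescale its input.
    return a * x

def step(x):
    # A Heaviside-style step function is non-linear:
    # the output jumps from 0 to 1 at the threshold.
    return np.where(x >= 0, 1.0, 0.0)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(linear(x))  # [-4. -1.  0.  1.  4.]  -- still just a scaled line
print(step(x))    # [ 0.  0.  1.  1.  1.]  -- a non-linear jump
```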

Activation functions can also help to standardize your data between layers. Some functions, such as the sigmoid, have the effect of "focusing" on the area of interest in the data: extreme values are pushed close together, while values near the function's point of inflection change quickly with respect to small changes in the input. Similarly, these functions can take any real number and map it into a finite range such as [0, 1], which can have many advantages for downstream calculations.
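A short sketch of that squashing behavior, using the standard logistic sigmoid (printed values below are approximate):

```python
import numpy as np

def sigmoid(x):
    # Maps any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(x))
# approx: [0.0000454, 0.269, 0.5, 0.731, 0.99995]
# Extreme inputs (-10 and 10) land near 0 and 1 and barely move,
# while inputs near the inflection point at 0 change the fastest.
```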

In this episode, we give an overview of the concept and discuss a few reasons why you might select one function versus another.
