[MINI] The Vanishing Gradient
Data Skeptic — June 30, 2017


This episode discusses the vanishing gradient: a problem that arises when training deep neural networks, in which nearly all of the gradients are very close to zero by the time back-propagation reaches the first hidden layer. This makes learning virtually impossible without a clever trick or improved methodology that helps the earlier layers begin to learn.
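The effect described above can be demonstrated numerically. The sketch below (a minimal NumPy illustration, not from the episode; the layer count, width, and weight scale are arbitrary assumptions) pushes an input through a stack of sigmoid layers and then back-propagates a unit error signal, printing the gradient norm at each layer. Because the sigmoid's derivative is at most 0.25, the norm shrinks multiplicatively and is tiny by the time it reaches the earliest layers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers = 10   # assumed depth for illustration
width = 32      # assumed layer width

# Forward pass through a deep stack of sigmoid layers.
x = rng.normal(size=width)
weights = [rng.normal(size=(width, width)) / np.sqrt(width)
           for _ in range(n_layers)]
activations = [x]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Back-propagate a unit error signal and record the gradient norm
# reaching each layer, from the output back toward the input.
grad = np.ones(width)
norms = []
for W, a in zip(reversed(weights), reversed(activations[1:])):
    grad = grad * a * (1.0 - a)  # sigmoid derivative, bounded by 0.25
    grad = W.T @ grad            # chain rule through the linear map
    norms.append(np.linalg.norm(grad))

for i, n in enumerate(norms):
    print(f"layer {n_layers - i}: gradient norm = {n:.2e}")
```

Running this shows the norm collapsing by orders of magnitude across just ten layers, which is why the early layers receive almost no learning signal without remedies such as ReLU activations, careful initialization, or residual connections.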
