[MINI] Dropout
Data Skeptic13 Jan 2017


Deep learning models are prone to overfitting. This is especially frustrating given how much time and computational resource training often requires to converge. One technique for fighting overfitting is dropout: randomly selecting some neurons in the network and setting their outputs to zero during each iteration of training. The core idea is that no particular input to a given layer is always available, so the layer cannot rely too heavily on any single signal.
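The idea above can be sketched in a few lines of NumPy. This is a minimal illustration of "inverted" dropout, not code from the episode: each activation is zeroed with probability `p_drop`, and the survivors are rescaled by `1 / (1 - p_drop)` so the expected activation is unchanged, which lets inference simply skip the layer. The function name and signature are my own for illustration.

```python
import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p_drop,
    rescale survivors by 1/(1 - p_drop) so the expectation is unchanged.
    At inference time (training=False) this is the identity."""
    if not training or p_drop == 0.0:
        return x, None
    rng = np.random.default_rng() if rng is None else rng
    keep_prob = 1.0 - p_drop
    # Mask entries are 0 (dropped) or 1/keep_prob (kept, rescaled).
    mask = (rng.random(x.shape) < keep_prob) / keep_prob
    return x * mask, mask  # the mask is reused in the backward pass

# Example: apply dropout to one layer's activations during training.
rng = np.random.default_rng(0)
activations = np.ones((4, 8))
dropped, mask = dropout_forward(activations, p_drop=0.5, rng=rng)
```

With `p_drop=0.5`, roughly half the entries of `dropped` are zero and the rest are scaled to 2.0, so the mean activation stays near 1.0 in expectation.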
