[MINI] Dropout
Data Skeptic13 Jan 2017


Deep learning models are prone to overfitting, which is especially frustrating given how much time and computational power training often requires. One technique for fighting overfitting is dropout: during each training iteration, a random subset of neurons in the network is temporarily set to zero. The core idea is that no particular input to a given layer is always available, so no single signal can be relied on too heavily.
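As a rough illustration of the idea described above, here is a minimal sketch of "inverted" dropout in NumPy. The function name, the keep/drop convention, and the rescaling by 1/(1-p) are my assumptions for this sketch, not details from the episode; the rescaling keeps the expected activation unchanged so the network behaves consistently at inference time, when dropout is switched off.

```python
import numpy as np

def dropout(activations, p=0.5, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed with probability p and the
    survivors are scaled by 1/(1-p), so the expected value of each
    activation is unchanged. At inference time (training=False) the
    activations pass through untouched.
    """
    if not training or p == 0.0:
        return activations
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= p  # keep with probability 1-p
    return activations * mask / (1.0 - p)

# Example: apply dropout to one layer's activations
x = np.ones((4, 3))
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
# Each entry of y is either 0.0 (dropped) or 2.0 (kept and rescaled)
```

Because a fresh random mask is drawn on every iteration, each forward pass effectively trains a different thinned sub-network, which is why dropout is often described as an implicit ensemble.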
