[MINI] Bias Variance Tradeoff
Data Skeptic, 13 Nov 2015

A discussion of the expected number of cars at a stoplight frames today's discussion of the bias variance tradeoff. The central idea of this concept relates to model complexity. A very simple model will likely generalize well from training to testing data, but will have high bias, since its simplicity can prevent it from capturing the relationship between the covariates and the output. As a model grows more and more complex, it may capture more of the underlying data, but the risk increases that it overfits the training data and therefore does not generalize (has high variance). The tradeoff between minimizing variance and minimizing bias is an ongoing challenge for data scientists, and an important discussion for skeptics around how much we should trust models.
