Language Modeling With State Space Models with Dan Fu - #630

Today we’re joined by Dan Fu, a PhD student at Stanford University. In our conversation, Dan discusses the limitations of state space models in language modeling and the search for alternative building blocks that can increase context length without becoming computationally infeasible. Dan walks us through the H3 architecture and the FlashAttention technique, which reduces a model’s memory footprint and makes fine-tuning feasible. We also explore his work on improving language models using synthetic languages, how long sequence lengths affect both training and inference, and the hope of finding a sub-quadratic alternative that handles language as effectively as the brute-force approach of attention. The complete show notes for this episode can be found at https://twimlai.com/go/630
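
For listeners curious about why state space models scale better than attention, here is a minimal sketch (not from the episode; all names and dimensions are illustrative) of the linear recurrence at the heart of SSM layers like those in H3. Each step updates a fixed-size state, so processing a sequence of length L costs O(L) rather than the O(L²) of pairwise attention; because the recurrence is linear and time-invariant, it can also be computed as a long convolution with FFTs in O(L log L) during training.

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discrete linear state space model over an input sequence.

    x[t] = A @ x[t-1] + B * u[t]   (state update)
    y[t] = C @ x[t]                (readout)

    The state x has a fixed size N, so the cost is O(L * N) in the
    sequence length L -- linear in L, unlike quadratic self-attention.
    """
    N = A.shape[0]
    x = np.zeros(N)
    ys = []
    for u_t in u:
        x = A @ x + B * u_t
        ys.append(C @ x)
    return np.array(ys)

# Toy example: a stable random SSM over a length-1024 scalar sequence.
rng = np.random.default_rng(0)
N, L = 16, 1024
A = 0.9 * np.eye(N) + 0.01 * rng.standard_normal((N, N))  # near-stable dynamics
B = rng.standard_normal(N)
C = rng.standard_normal(N)
u = rng.standard_normal(L)

y = ssm_scan(A, B, C, u)
print(y.shape)  # (1024,)
```

This is only the sequential view; real SSM language-model layers stack many such channels, learn A, B, and C, and exploit the convolutional form for parallel training.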
