OLMo: Everything You Need to Train an Open Source LLM with Akshita Bhagia - #674

Today we’re joined by Akshita Bhagia, a senior research engineer at the Allen Institute for AI (AI2). Akshita joins us to discuss OLMo, a new open source language model released in 7-billion- and 1-billion-parameter variants. Unlike similar models from Meta, Mistral, and others, OLMo ships with more than just weights: AI2 has also published the dataset and the key tools used to train the model. In our chat with Akshita, we dig into the OLMo models and the various projects under the OLMo umbrella, including Dolma, an open three-trillion-token corpus for language model pretraining, and Paloma, a benchmark and tooling for evaluating language model performance across a variety of domains. The complete show notes for this episode can be found at twimlai.com/go/674.
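
For listeners who want to experiment, the OLMo checkpoints are published on Hugging Face. Below is a minimal sketch of loading the 7B model for inference with the transformers library; the model ID and the trust_remote_code flag reflect the early OLMo releases (which used the hf_olmo integration) and may differ for newer revisions:

```python
# Minimal sketch: load OLMo-7B via Hugging Face transformers for inference.
# Assumes the allenai/OLMo-7B checkpoint; early releases required
# trust_remote_code=True for the custom hf_olmo model code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-7B"  # a 1B variant (allenai/OLMo-1B) is also available
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Language modeling is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```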