LSTMs, Plus a Deep Learning History Lesson with Jürgen Schmidhuber - TWiML Talk #44

This week we have a very special interview to share with you! Those of you who’ve been receiving my newsletter for a while might remember that while in Switzerland last month, I had the pleasure of interviewing Jürgen Schmidhuber at his lab, IDSIA, the Dalle Molle Institute for Artificial Intelligence Research in Lugano, Switzerland, where he serves as Scientific Director. In addition to his role at IDSIA, Jürgen is also Co-Founder and Chief Scientist of NNaisense, a company that is using AI to build large-scale neural network solutions for “superhuman perception and intelligent automation.” Jürgen is an interesting, accomplished, and in some circles controversial figure in the AI community, and we covered a lot of very interesting ground in our discussion, so much so that I couldn't truly unpack it all until I had a chance to sit with it after the fact. We talked a bunch about his work on neural networks, especially LSTMs, or Long Short-Term Memory networks, which are a key innovation behind many of the advances we’ve seen in deep learning and its application over the past few years. Along the way, Jürgen walks us through a deep learning history lesson that spans 50+ years. It was like walking back in time with the three-eyed raven. I know you’re really going to enjoy this one, and by the way, this is definitely a nerd alert show! For the show notes, visit twimlai.com/talk/44

Episodes (781)

AI Orchestration for Smart Cities and the Enterprise with Robin Braun and Luke Norris - #755

Today, we're joined by Robin Braun, VP of AI business development for hybrid cloud at HPE, and Luke Norris, co-founder and CEO of Kamiwaza, to discuss how AI systems can be used to automate complex wo...

12 Nov 2025 · 54min

Building an AI Mathematician with Carina Hong - #754

In this episode, Carina Hong, founder and CEO of Axiom, joins us to discuss her work building an "AI Mathematician." Carina explains why this is a pivotal moment for AI in mathematics, citing a conver...

4 Nov 2025 · 55min

High-Efficiency Diffusion Models for On-Device Image Generation and Editing with Hung Bui - #753

In this episode, Hung Bui, Technology Vice President at Qualcomm, joins us to explore the latest high-efficiency techniques for running generative AI, particularly diffusion models, on-device. We dive...

28 Oct 2025 · 52min

Vibe Coding's Uncanny Valley with Alexandre Pesant - #752

Today, we're joined by Alexandre Pesant, AI lead at Lovable, who joins us to discuss the evolution and practice of vibe coding. Alex shares his take on how AI is enabling a shift in software developme...

22 Oct 2025 · 1h 12min

Dataflow Computing for AI Inference with Kunle Olukotun - #751

In this episode, we're joined by Kunle Olukotun, professor of electrical engineering and computer science at Stanford University and co-founder and chief technologist at SambaNova Systems, to discuss ...

14 Oct 2025 · 57min

Recurrence and Attention for Long-Context Transformers with Jacob Buckman - #750

Today, we're joined by Jacob Buckman, co-founder and CEO of Manifest AI, to discuss achieving long context in transformers. We discuss the bottlenecks of scaling context length and recent techniques to...

7 Oct 2025 · 57min

The Decentralized Future of Private AI with Illia Polosukhin - #749

In this episode, Illia Polosukhin, a co-author of the seminal "Attention Is All You Need" paper and co-founder of Near AI, joins us to discuss his vision for building private, decentralized, and user-...

30 Sep 2025 · 1h 5min

Inside Nano Banana 🍌 and the Future of Vision-Language Models with Oliver Wang - #748

Today, we’re joined by Oliver Wang, principal scientist at Google DeepMind and tech lead for Gemini 2.5 Flash Image—better known by its code name, “Nano Banana.” We dive into the development and capab...

23 Sep 2025 · 1h 3min
