Angela & Danielle — Designing ML Models for Millions of Consumer Robots

👩‍💻👩‍💻 On this episode of Gradient Dissent, our guests are Angela Bassa and Danielle Dean!

Angela is an expert in building and leading data teams. An MIT-trained and Edelman-award-winning mathematician, she has over 15 years of experience across industries, spanning finance, life sciences, agriculture, marketing, energy, software, and robotics. Angela heads Data Science and Machine Learning at iRobot, where her teams help bring intelligence to a global fleet of millions of consumer robots. She is also a renowned keynote speaker and author, with credits including the Wall Street Journal and Harvard Business Review.

Follow Angela on Twitter: https://twitter.com/angebassa
And on her website: https://www.angelabassa.com/

Danielle Dean, PhD, is the Technical Director of Machine Learning at iRobot, where she is helping lead the intelligence revolution for robots. She leads a team that leverages machine learning, reinforcement learning, and software engineering to build algorithms that will result in massive improvements in iRobot's robots. Before iRobot, Danielle was a Principal Data Scientist Lead at Microsoft in AzureCAT Engineering within the Cloud AI Platform division.

Follow Danielle on Twitter: https://twitter.com/danielleodean

Check out our podcast homepage for transcripts and more episodes: www.wandb.com/podcast

🔊 Get our podcast on Apple and Spotify!
Apple Podcasts: https://bit.ly/2WdrUvI
Spotify: https://bit.ly/2SqtadF

We started Weights and Biases to build tools for machine learning practitioners because we care a lot about the impact that machine learning can have in the world, and we love working in the trenches with the people building these models. One of the most fun things about building these tools has been the conversations with ML practitioners and learning about the interesting things they're working on. This process has been so fun that we wanted to open it up to the world in the form of our new podcast. We hope you have as much fun listening to it as we had making it.

👩🏼‍🚀 Weights and Biases: We're always free for academics and open source projects. Email carey@wandb.com with any questions or feature suggestions.
- Blog: https://www.wandb.com/articles
- Gallery: See what you can create with W&B - https://app.wandb.ai/gallery
- Continue the conversation in our Slack community: http://bit.ly/wandb-forum

🎙 Host: Lukas Biewald - https://twitter.com/l2k
👩🏼‍💻 Producer: Lavanya Shukla - https://twitter.com/lavanyaai
📹 Editor: Cayla Sharp - http://caylasharp.com/

Episodes (131)

Sean & Greg — Biology and ML for Drug Discovery

Sean McClain is the founder and CEO, and Gregory Hannum is the VP of AI Research at Absci, a biotech company that's using deep learning to expedite drug discovery and development.

Lukas, Sean, and Greg talk about why Absci started investing so heavily in ML research (it all comes back to the data), what it'll take to build the GPT-3 of DNA, and where the future of pharma is headed. Sean and Greg also share some of the challenges of building cross-functional teams and combining two highly specialized fields like biology and ML.

The complete show notes (transcript and links) can be found here: http://wandb.me/gd-sean-and-greg

---

Connect with Sean and Greg:
📍 Sean's Twitter: https://twitter.com/seanrmcclain
📍 Greg's Twitter: https://twitter.com/gregory_hannum
📍 Absci's Twitter: https://twitter.com/abscibio

---

Timestamps:
0:00 Intro
0:53 How Absci merges biology and AI
11:24 Why Absci started investing in ML
19:00 Creating the GPT-3 of DNA
25:34 Investing in data collection and in ML teams
33:14 Clinical trials and Absci's revenue structure
38:17 Combining knowledge from different domains
45:22 The potential of multitask learning
50:43 Why biological data is tricky to work with
55:00 Outro

---

Subscribe and listen to our podcast today!
👉 Apple Podcasts: http://wandb.me/apple-podcasts
👉 Google Podcasts: http://wandb.me/google-podcasts
👉 Spotify: http://wandb.me/spotify

2 Dec 2021 · 55min

Chris, Shawn, and Lukas — The Weights & Biases Journey

You might know him as the host of Gradient Dissent, but Lukas is also the CEO of Weights & Biases, a developer-first ML tools platform!

In this special episode, the three W&B co-founders — Chris (CVP), Shawn (CTO), and Lukas (CEO) — sit down to tell the company's origin stories, reflect on the highs and lows, and give advice to engineers looking to start their own business.

Chris reveals the W&B server architecture (tl;dr - React + GraphQL), Shawn shares his favorite product feature (it's a hidden frontend layer), and Lukas explains why it's so important to work with customers that inspire you.

The complete show notes (transcript and links) can be found here: http://wandb.me/gd-wandb-cofounders

---

Connect with us:
📍 Chris' Twitter: https://twitter.com/vanpelt
📍 Shawn's Twitter: https://twitter.com/shawnup
📍 Lukas' Twitter: https://twitter.com/l2k
📍 W&B's Twitter: https://twitter.com/weights_biases

---

Timestamps:
0:00 Intro
1:29 The stories behind Weights & Biases
7:45 The W&B tech stack
9:28 Looking back at the beginning
11:42 Hallmark moments
14:49 Favorite product features
16:49 Rewriting the W&B backend
18:21 The importance of customer feedback
21:18 How Chris and Shawn have changed
22:35 How the ML space has changed
28:24 Staying positive when things look bleak
32:19 Lukas' advice to new entrepreneurs
35:29 Hopes for the next five years
38:09 Making a paintbot & model understanding
41:30 Biggest bottlenecks in deployment
44:08 Outro
44:38 Bonus: Under- vs overrated technologies

---

Subscribe and listen to our podcast today!
👉 Apple Podcasts: http://wandb.me/apple-podcasts
👉 Google Podcasts: http://wandb.me/google-podcasts
👉 Spotify: http://wandb.me/spotify

5 Nov 2021 · 49min

Pete Warden — Practical Applications of TinyML

Pete is the Technical Lead of the TensorFlow Micro team, which works on deep learning for mobile and embedded devices.

Lukas and Pete talk about hacking a Raspberry Pi to run AlexNet, the power and size constraints of embedded devices, and techniques to reduce model size. Pete also explains real-world applications of TensorFlow Lite Micro and shares what it's been like to work on TensorFlow from the beginning.

The complete show notes (transcript and links) can be found here: http://wandb.me/gd-pete-warden

---

Connect with Pete:
📍 Twitter: https://twitter.com/petewarden
📍 Website: https://petewarden.com/

---

Timestamps:
0:00 Intro
1:23 Hacking a Raspberry Pi to run neural nets
13:50 Model and hardware architectures
18:56 Training a magic wand
21:47 Raspberry Pi vs Arduino
27:51 Reducing model size
33:29 Training on the edge
39:47 What it's like to work on TensorFlow
47:45 Improving datasets and model deployment
53:05 Outro

---

Subscribe and listen to our podcast today!
👉 Apple Podcasts: http://wandb.me/apple-podcasts
👉 Google Podcasts: http://wandb.me/google-podcasts
👉 Spotify: http://wandb.me/spotify

21 Oct 2021 · 53min

Pieter Abbeel — Robotics, Startups, and Robotics Startups

Pieter is the Chief Scientist and Co-founder at Covariant, where his team is building universal AI for robotic manipulation. Pieter also hosts The Robot Brains Podcast, in which he explores how far humanity has come in its mission to create conscious computers, mindful machines, and rational robots.

Lukas and Pieter explore the state of affairs of robotics in 2021, the challenges of achieving consistency and reliability, and what it'll take to make robotics more ubiquitous. Pieter also shares some perspective on entrepreneurship, from how he knew it was time to commercialize Gradescope, to what he looks for in co-founders, to why he started Covariant.

Show notes: http://wandb.me/gd-pieter-abbeel

---

Connect with Pieter:
📍 Twitter: https://twitter.com/pabbeel
📍 Website: https://people.eecs.berkeley.edu/~pabbeel/
📍 The Robot Brains Podcast: https://www.therobotbrains.ai/

---

Timestamps:
0:00 Intro
1:15 The challenges of robotics
8:10 Progress in robotics
13:34 Imitation learning and reinforcement learning
21:37 Simulated data, real data, and reliability
27:53 The increasing capabilities of robotics
36:23 Entrepreneurship and co-founding Gradescope
44:35 The story behind Covariant
47:50 Pieter's communication tips
52:13 What Pieter's currently excited about
55:08 Focusing on good UI and high reliability
57:01 Outro

7 Oct 2021 · 57min

Chris Albon — ML Models and Infrastructure at Wikimedia

In this episode we're joined by Chris Albon, Director of Machine Learning at the Wikimedia Foundation.

Lukas and Chris talk about Wikimedia's approach to content moderation, what it's like to work in a place so transparent that even internal chats are public, how Wikimedia uses machine learning (spoiler: they build a lot of models to help editors), and why they're switching to Kubeflow and Docker. Chris also shares how his focus on outcomes has shaped his career and his approach to technical interviews.

Show notes: http://wandb.me/gd-chris-albon

---

Connect with Chris:
- Twitter: https://twitter.com/chrisalbon
- Website: https://chrisalbon.com/

---

Timestamps:
0:00 Intro
1:08 How Wikimedia approaches moderation
9:55 Working in the open and embracing humility
16:08 Going down Wikipedia rabbit holes
20:03 How Wikimedia uses machine learning
27:38 Wikimedia's ML infrastructure
42:56 How Chris got into machine learning
46:43 Machine Learning Flashcards and technical interviews
52:10 Low-power models and MLOps
55:58 Outro

23 Sep 2021 · 56min

Emily M. Bender — Language Models and Linguistics

In this episode, Emily and Lukas dive into the problems with bigger and bigger language models, the difference between form and meaning, the limits of benchmarks, and why it's important to name the languages we study.

Show notes (links to papers and transcript): http://wandb.me/gd-emily-m-bender

---

Emily M. Bender is a Professor of Linguistics and Faculty Director of the Master's Program in Computational Linguistics at the University of Washington. Her research areas include multilingual grammar engineering, variation (within and across languages), the relationship between linguistics and computational linguistics, and societal issues in NLP.

---

Timestamps:
0:00 Sneak peek, intro
1:03 Stochastic Parrots
9:57 The societal impact of big language models
16:49 How language models can be harmful
26:00 The important difference between linguistic form and meaning
34:40 The octopus thought experiment
42:11 Language acquisition and the future of language models
49:47 Why benchmarks are limited
54:38 Ways of complementing benchmarks
1:01:20 The #BenderRule
1:03:50 Language diversity and linguistics
1:12:49 Outro

9 Sep 2021 · 1h 12min

Jeff Hammerbacher — From data science to biomedicine

Jeff talks about building Facebook's early data team, founding Cloudera, and transitioning into biomedicine with Hammer Lab and Related Sciences. (Read more: http://wandb.me/gd-jeff-hammerbacher)

---

Jeff Hammerbacher is a scientist, software developer, entrepreneur, and investor. Jeff's current work focuses on drug discovery at Related Sciences, a biotech venture creation firm that he co-founded in 2020.

Prior to his work at Related Sciences, Jeff was the Principal Investigator of Hammer Lab, a founder and the Chief Scientist of Cloudera, an Entrepreneur-in-Residence at Accel, and the manager of the Data team at Facebook.

---

Follow Gradient Dissent on Twitter: https://twitter.com/weights_biases

---

Timestamps:
0:00 Sneak peek, intro
1:13 The start of Facebook's data science team
6:53 Facebook's early tech stack
14:20 Early growth strategies at Facebook
17:37 The origin story of Cloudera
24:51 Cloudera's success, in retrospect
31:05 Jeff's transition into biomedicine
38:38 Immune checkpoint blockade in cancer therapy
48:55 Data and techniques for biomedicine
53:00 Why Jeff created Related Sciences
56:32 Outro

26 Aug 2021 · 56min

Josh Bloom — The Link Between Astronomy and ML

Josh explains how astronomy and machine learning have informed each other, their current limitations, and where their intersection goes from here. (Read more: http://wandb.me/gd-josh-bloom)

---

Josh is a Professor of Astronomy and Chair of the Astronomy Department at UC Berkeley. His research interests include the intersection of machine learning and physics, time-domain transient events, artificial intelligence, and optical/infrared instrumentation.

---

Follow Gradient Dissent on Twitter: https://twitter.com/weights_biases

---

Timestamps:
0:00 Intro, sneak peek
1:15 How astronomy has informed ML
4:20 The big questions in astronomy today
10:15 On dark matter and dark energy
16:37 Finding life on other planets
19:55 Driving advancements in astronomy
27:05 Putting telescopes in space
31:05 Why Josh started using ML in his research
33:54 Crowdsourcing in astronomy
36:20 How ML has (and hasn't) informed astronomy
47:22 The next generation of cross-functional grad students
50:50 How Josh started coding
56:11 Incentives and maintaining research codebases
1:00:01 ML4Science's tech stack
1:02:11 Uncertainty quantification in a sensor-based world
1:04:28 Why it's not good to always get an answer
1:07:47 Outro

20 Aug 2021 · 1h 8min
