Cade Metz — The Stories Behind the Rise of AI

How Cade got access to the stories behind some of the biggest advancements in AI, and the dynamic playing out between leaders at companies like Google, Microsoft, and Facebook. Cade Metz is a New York Times reporter covering artificial intelligence, driverless cars, robotics, virtual reality, and other emerging areas. Previously, he was a senior staff writer with Wired magazine and the U.S. editor of The Register, one of Britain's leading science and technology news sites. His first book, "Genius Makers", tells the stories of the pioneers behind AI. Get the book: http://bit.ly/GeniusMakers Follow Cade on Twitter: https://twitter.com/CadeMetz/ And on LinkedIn: https://www.linkedin.com/in/cademetz/ Topics discussed: 0:00 sneak peek, intro 3:25 audience and characters 7:18 *spoiler alert* AGI 11:01 book ends, but story goes on 17:31 overinflated claims in AI 23:12 DeepMind, OpenAI, building AGI 29:02 neuroscience and psychology, outsiders 34:35 Early adopters of ML 38:34 WojNet, where is credit due? 42:45 press covering AI 46:38 Aligning technology and need Read the transcript and discover awesome ML projects: http://wandb.me/cade-metz Get our podcast on these platforms: Apple Podcasts: http://wandb.me/apple-podcasts Spotify: http://wandb.me/spotify Google: http://wandb.me/google-podcasts YouTube: http://wandb.me/youtube Soundcloud: http://wandb.me/soundcloud Tune in to our bi-weekly virtual salon and listen to industry leaders and researchers in machine learning share their research: http://wandb.me/salon Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Our gallery features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, and industry leaders sharing best practices: https://wandb.ai/gallery

Episodes (131)

Xavier Amatriain — Building AI-powered Primary Care

Xavier shares his experience deploying healthcare models, augmenting primary care with AI, the challenges of "ground truth" in medicine, and robustness in ML. --- Xavier Amatriain is co-founder and CTO of Curai, an ML-based primary care chat system. Previously, he was VP of Engineering at Quora, and Research/Engineering Director at Netflix, where he started and led the Algorithms team responsible for Netflix's recommendation systems. --- ⏳ Timestamps: 0:00 Sneak peek, intro 0:49 What is Curai? 5:48 The role of AI within Curai 8:44 Why Curai keeps humans in the loop 15:00 Measuring diagnostic accuracy 18:53 Patient safety 22:39 Different types of models at Curai 25:42 Using GPT-3 to generate training data 32:13 How Curai monitors and debugs models 35:19 Model explainability 39:27 Robustness in ML 45:52 Connecting metrics to impact 49:32 Outro 🌟 Show notes: - http://wandb.me/gd-xavier-amatriain - Transcription of the episode - Links to papers, projects, and people --- Follow us on Twitter! 📍 https://twitter.com/wandb_gd Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud

30 July 2021 · 50 min

Spence Green — Enterprise-scale Machine Translation

Spence shares his experience creating a product around human-in-the-loop machine translation, and explains how machine translation has evolved over the years. --- Spence Green is co-founder and CEO of Lilt, an AI-powered language translation platform. Lilt combines human translators and machine translation to produce high-quality translations more efficiently. --- 🌟 Show notes: - http://wandb.me/gd-spence-green - Transcription of the episode - Links to papers, projects, and people ⏳ Timestamps: 0:00 Sneak peek, intro 0:45 The story behind Lilt 3:08 Statistical MT vs neural MT 6:30 Domain adaptation and personalized models 8:00 The emergence of neural MT and development of Lilt 13:09 What success looks like for Lilt 18:20 Models that self-correct for gender bias 19:39 How Lilt runs its models in production 26:33 How far can MT go? 29:55 Why Lilt cares about human-computer interaction 35:04 Bilingual grammatical error correction 37:18 Human parity in MT 39:41 The unexpected challenges of prototype to production --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

16 July 2021 · 43 min

Roger & DJ — The Rise of Big Data and CA's COVID-19 Response

Roger and DJ share some of the history behind data science as we know it today, and reflect on their experiences working on California's COVID-19 response. --- Roger Magoulas is Senior Director of Data Strategy at Astronomer, where he works on data infrastructure, analytics, and community development. Previously, he was VP of Research at O'Reilly and co-chair of O'Reilly's Strata Data and AI Conference. DJ Patil is a board member and former CTO of Devoted Health, a healthcare company for seniors. He was also Chief Data Scientist under the Obama administration and the Head of Data Science at LinkedIn. Roger and DJ recently volunteered for the California COVID-19 response, and worked with data to understand case counts, bed capacities, and the impact of interventions. Connect with Roger and DJ: 📍 Roger's Twitter: https://twitter.com/rogerm 📍 DJ's Twitter: https://twitter.com/dpatil --- 🌟 Transcript: http://wandb.me/gd-roger-and-dj 🌟 ⏳ Timestamps: 0:00 Sneak peek, intro 1:03 Coining the terms "big data" and "data scientist" 7:12 The rise of data science teams 15:28 Big Data, Hadoop, and Spark 23:10 The importance of using the right tools 29:20 BLUF: Bottom Line Up Front 34:44 California's COVID response 41:21 The human aspects of responding to COVID 48:33 Reflecting on the impact of COVID interventions 57:06 Advice on doing meaningful data science work 1:04:18 Outro 🍀 Links: 1. "MapReduce: Simplified Data Processing on Large Clusters" (Dean and Ghemawat, 2004): https://research.google/pubs/pub62/ 2. "Big Data: Technologies and Techniques for Large-Scale Data" (Magoulas and Lorica, 2009): https://academics.uccs.edu/~ooluwada/courses/datamining/ExtraReading/BigData 3. The O'RLY book covers: https://www.businessinsider.com/these-hilarious-memes-perfectly-capture-what-its-like-to-work-in-tech-2016-4 4. "The Premonition" (Lewis, 2021): https://www.npr.org/2021/05/03/991570372/michael-lewis-the-premonition-is-a-sweeping-indictment-of-the-cdc 5. Why California's beaches are glowing with bioluminescence: https://www.youtube.com/watch?v=AVYSr19ReOs 6. Sturgis Motorcycle Rally: https://en.wikipedia.org/wiki/Sturgis_Motorcycle_Rally --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

8 July 2021 · 1 h 4 min

Amelia & Filip — How Pandora Deploys ML Models into Production

Amelia and Filip give insights into the recommender systems powering Pandora, from developing models to balancing effectiveness and efficiency in production. --- Amelia Nybakke is a Software Engineer at Pandora. Her team is responsible for the production system that serves models to listeners. Filip Korzeniowski is a Senior Scientist at Pandora working on recommender systems. Before that, he was a PhD student working on deep neural networks for acoustic and language modeling applied to musical audio recordings. Connect with Amelia and Filip: 📍 Amelia's LinkedIn: https://www.linkedin.com/in/amelia-nybakke-60bba5107/ 📍 Filip's LinkedIn: https://www.linkedin.com/in/filip-korzeniowski-28b33815a/ --- ⏳ Timestamps: 0:00 Sneak peek, intro 0:42 What type of ML models are at Pandora? 3:39 What makes two songs similar or not similar? 7:33 Improving models and A/B testing 8:52 Chaining, retraining, versioning, and tracking models 13:29 Useful development tools 15:10 Debugging models 18:28 Communicating progress 20:33 Tuning and improving models 23:08 How Pandora puts models into production 29:45 Bias in ML models 36:01 Repetition vs novelty in recommended songs 38:01 The bottlenecks of deployment 🌟 Transcript: http://wandb.me/gd-amelia-and-filip 🌟 Links: 📍 Amelia's "Women's History Month" playlist: https://www.pandora.com/playlist/PL:1407374934299927:100514833 --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts​​ 👉 Spotify: http://wandb.me/spotify​ 👉 Google Podcasts: http://wandb.me/google-podcasts​​ 👉 YouTube: http://wandb.me/youtube​​ 👉 Soundcloud: http://wandb.me/soundcloud​ Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack​​ Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

1 July 2021 · 40 min

Luis Ceze — Accelerating Machine Learning Systems

From Apache TVM to OctoML, Luis gives direct insight into the world of ML hardware optimization, and where systems optimization is heading. --- Luis Ceze is co-founder and CEO of OctoML, co-author of the Apache TVM Project, and Professor of Computer Science and Engineering at the University of Washington. His research focuses on the intersection of computer architecture, programming languages, machine learning, and molecular biology. Connect with Luis: 📍 Twitter: https://twitter.com/luisceze 📍 University of Washington profile: https://homes.cs.washington.edu/~luisceze/ --- ⏳ Timestamps: 0:00 Intro and sneak peek 0:59 What is TVM? 8:57 Freedom of choice in software and hardware stacks 15:53 How new libraries can improve system performance 20:10 Trade-offs between efficiency and complexity 24:35 Specialized instructions 26:34 The future of hardware design and research 30:03 Where do architecture and research go from here? 30:56 The environmental impact of efficiency 32:49 Optimizing and trade-offs 37:54 What is OctoML and the Octomizer? 42:31 Automating systems design with and for ML 44:18 ML and molecular biology 46:09 The challenges of deployment and post-deployment 🌟 Transcript: http://wandb.me/gd-luis-ceze 🌟 Links: 1. OctoML: https://octoml.ai/ 2. Apache TVM: https://tvm.apache.org/ 3. "Scalable and Intelligent Learning Systems" (Chen, 2019): https://digital.lib.washington.edu/researchworks/handle/1773/44766 4. "Principled Optimization Of Dynamic Neural Networks" (Roesch, 2020): https://digital.lib.washington.edu/researchworks/handle/1773/46765 5. "Cross-Stack Co-Design for Efficient and Adaptable Hardware Acceleration" (Moreau, 2018): https://digital.lib.washington.edu/researchworks/handle/1773/43349 6. "TVM: An Automated End-to-End Optimizing Compiler for Deep Learning" (Chen et al., 2018): https://www.usenix.org/system/files/osdi18-chen.pdf 7. Porcupine is a molecular tagging system introduced in "Rapid and robust assembly and decoding of molecular tags with DNA-based nanopore signatures" (Doroschak et al., 2020): https://www.nature.com/articles/s41467-020-19151-8 --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

24 June 2021 · 48 min

Matthew Davis — Bringing Genetic Insights to Everyone

Matthew explains how combining machine learning and computational biology can provide mainstream medicine with better diagnostics and insights. --- Matthew Davis is Head of AI at Invitae, the largest and fastest-growing genetic testing company in the world. His research includes bioinformatics, computational biology, NLP, reinforcement learning, and information retrieval. Matthew was previously at IBM Research AI, where he led a research team focused on improving AI systems. Connect with Matthew: 📍 LinkedIn: https://www.linkedin.com/in/matthew-davis-51233386/ 📍 Twitter: https://twitter.com/deadsmiths --- ⏳ Timestamps: 0:00 Sneak peek, intro 1:02 What is Invitae? 2:58 Why genetic testing can help everyone 7:51 How Invitae uses ML techniques 14:02 Modeling molecules and deciding which genes to look at 22:22 NLP applications in bioinformatics 27:10 Team structure at Invitae 36:50 Why reasoning is an underrated topic in ML 40:25 Why having a clear buy-in is important 🌟 Transcript: http://wandb.me/gd-matthew-davis 🌟 Links: 📍 Invitae: https://www.invitae.com/en 📍 Careers at Invitae: https://www.invitae.com/en/careers/ --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

17 June 2021 · 43 min

Clément Delangue — The Power of the Open Source Community

Clem explains the virtuous cycles behind the creation and success of Hugging Face, and shares his thoughts on where NLP is heading. --- Clément Delangue is co-founder and CEO of Hugging Face, the AI community building the future. Hugging Face started as an open source NLP library and has quickly grown into a commercial product used by over 5,000 companies. Connect with Clem: 📍 Twitter: https://twitter.com/ClementDelangue 📍 LinkedIn: https://www.linkedin.com/in/clementdelangue/ --- 🌟 Transcript: http://wandb.me/gd-clement-delangue 🌟 ⏳ Timestamps: 0:00 Sneak peek and intro 0:56 What is Hugging Face? 4:15 The success of Hugging Face Transformers 7:53 Open source and virtuous cycles 10:37 Working with both TensorFlow and PyTorch 13:20 The "Write With Transformer" project 14:36 Transfer learning in NLP 16:43 BERT and DistilBERT 22:33 GPT 26:32 The power of the open source community 29:40 Current applications of NLP 35:15 The Turing Test and conversational AI 41:19 Why speech is an upcoming field within NLP 43:44 The human challenges of machine learning Links Discussed: 📍 Write With Transformer, Hugging Face Transformers' text generation demo: https://transformer.huggingface.co/ 📍 "Attention Is All You Need" (Vaswani et al., 2017): https://arxiv.org/abs/1706.03762 📍 EleutherAI and GPT-Neo: https://github.com/EleutherAI/gpt-neo 📍 Rasa, open source conversational AI: https://rasa.com/ 📍 Roblox article on BERT: https://blog.roblox.com/2020/05/scaled-bert-serve-1-billion-daily-requests-cpus/ --- Get our podcast on these platforms: 👉 Apple Podcasts: http://wandb.me/apple-podcasts 👉 Spotify: http://wandb.me/spotify 👉 Google Podcasts: http://wandb.me/google-podcasts 👉 YouTube: http://wandb.me/youtube 👉 Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

10 June 2021 · 46 min

Wojciech Zaremba — What Could Make AI Conscious?

Wojciech joins us to talk about the principles behind OpenAI, the Fermi Paradox, and the future stages of developments in AGI. --- Wojciech Zaremba is a co-founder of OpenAI, a research company dedicated to discovering and enacting the path to safe artificial general intelligence. He was also Head of Robotics at OpenAI, where his team developed general-purpose robots through new approaches to transfer learning, and taught robots complex behaviors. Connect with Wojciech: Personal website: https://wojzaremba.com/ Twitter: https://twitter.com/woj_zaremba --- Topics Discussed: 0:00 Sneak peek and intro 1:03 The people and principles behind OpenAI 6:31 The stages of future AI developments 13:42 The Fermi paradox 16:18 What drives Wojciech? 19:17 Thoughts on robotics 24:58 Dota and other projects at OpenAI 33:42 What would make an AI conscious? 41:31 How to succeed in robotics Transcript: http://wandb.me/gd-wojciech-zaremba Links: Fermi paradox: https://en.wikipedia.org/wiki/Fermi_paradox OpenAI and Dota: https://openai.com/projects/five/ --- Get our podcast on these platforms: Apple Podcasts: http://wandb.me/apple-podcasts Spotify: http://wandb.me/spotify Google Podcasts: http://wandb.me/google-podcasts YouTube: http://wandb.me/youtube Soundcloud: http://wandb.me/soundcloud Join our community of ML practitioners where we host AMAs, share interesting projects and meet other people working in Deep Learning: http://wandb.me/slack Check out Fully Connected, which features curated machine learning reports by researchers exploring deep learning techniques, Kagglers showcasing winning models, industry leaders sharing best practices, and more: https://wandb.ai/fully-connected

3 June 2021 · 44 min
