
BERT is Shallow
Tim Niven joins us this week to discuss his work exploring the limits of what BERT can do on certain natural language tasks such as adversarial attacks, compositional learning, and systematic learning...
23 September 2019, 20 min

BERT is Magic
Kyle pontificates on how impressed he is with BERT.
16 September 2019, 18 min

Applied Data Science in Industry
Kyle sits down with Jen Stirrup to inquire about her experiences helping companies deploy data science solutions in a variety of different settings.
6 September 2019, 21 min

Building the howto100m Video Corpus
Video annotation is an expensive and time-consuming process. As a consequence, the available video datasets are useful but small. The availability of machine transcribed explainer videos offers a uniq...
19 August 2019, 22 min

BERT
Kyle provides a non-technical overview of why Bidirectional Encoder Representations from Transformers (BERT) is a powerful tool for natural language processing projects.
29 July 2019, 13 min

ONNX
Kyle interviews Prasanth Pulavarthi about the ONNX format for deep neural networks.
22 July 2019, 20 min

Catastrophic Forgetting
Kyle and Linhda discuss some high-level theory of mind and give an overview of the machine learning concept of catastrophic forgetting.
15 July 2019, 21 min

Transfer Learning
Sebastian Ruder is a research scientist at DeepMind. In this episode, he joins us to discuss the state of the art in transfer learning and his contributions to it.
8 July 2019, 29 min
