
ML Ops
Kyle met up with Damian Brady at MS Ignite 2019 to discuss machine learning operations.
27 November 2019, 36 min

Annotator Bias
Modern deep learning approaches to natural language processing are voracious in their demands for large corpora to train on. Folk wisdom used to estimate that around 100k documents were required f...
23 November 2019, 25 min

NLP for Developers
While at MS Build 2019, Kyle sat down with Lance Olson from the Applied AI team about how tools like cognitive services and cognitive search enable non-data scientists to access relatively advanced NL...
20 November 2019, 29 min

Indigenous American Language Research
Manuel Mager joins us to discuss natural language processing for low- and under-resourced languages. We discuss current work in this area and the Naki Project, which aggregates research on NLP for nati...
13 November 2019, 22 min

Talking to GPT-2
GPT-2 is yet another in a succession of models, like ELMo and BERT, which adopt a similar deep learning architecture and train an unsupervised model on a massive text corpus. As we have been covering re...
31 October 2019, 29 min

Reproducing Deep Learning Models
Rajiv Shah attempted to reproduce an earthquake-predicting deep learning model. His results exposed some issues with the model. Kyle and Rajiv discuss the original paper and Rajiv's analysis.
23 October 2019, 22 min

What BERT is Not
Allyson Ettinger joins us to discuss her work in computational linguistics, specifically in exploring some of the ways in which the popular natural language processing approach BERT has limitations.
14 October 2019, 27 min

SpanBERT
Omer Levy joins us to discuss "SpanBERT: Improving Pre-training by Representing and Predicting Spans". https://arxiv.org/abs/1907.10529
8 October 2019, 24 min
