Annotator Bias
Data Skeptic · 23 Nov 2019

The modern deep learning approaches to natural language processing are voracious in their demands for large corpora to train on. Folk wisdom once held that around 100k documents were required for effective training. The availability of broadly trained, general-purpose models like BERT has made it possible to use transfer learning to achieve novel results on much smaller corpora.

Thanks to these advancements, an NLP researcher can get value out of fewer examples, since transfer learning provides a head start and lets them focus on learning the nuances of the language specifically relevant to the task at hand. Thus, small specialized corpora are both useful and practical to create.

In this episode, Kyle speaks with Mor Geva, lead author on the recent paper Are We Modeling the Task or the Annotator? An Investigation of Annotator Bias in Natural Language Understanding Datasets, which explores some unintended consequences of the typical procedure followed for generating corpora.

Source code for the paper available here: https://github.com/mega002/annotator_bias

Episodes (590)

Building the howto100m Video Corpus

Video annotation is an expensive and time-consuming process. As a consequence, the available video datasets are useful but small. The availability of machine-transcribed explainer videos offers a unique opportunity to rapidly develop a useful, if dirty, corpus of videos that are "self-annotating", as hosts explain the actions they are taking on screen. This episode is a discussion of the HowTo100M dataset, a project which has assembled a video corpus of 136M video clips with captions covering 23k activities.

Related links:
- The paper will be presented at ICCV 2019
- @antoine77340 (Antoine on GitHub)
- Antoine's homepage

19 Aug 2019 · 22min

BERT

Kyle provides a non-technical overview of why Bidirectional Encoder Representations from Transformers (BERT) is a powerful tool for natural language processing projects.

29 Jul 2019 · 13min

Onnx

Kyle interviews Prasanth Pulavarthi about the ONNX format for deep neural networks.

22 Jul 2019 · 20min

Catastrophic Forgetting

Kyle and Linhda discuss some high-level theory of mind and give an overview of the machine learning concept of catastrophic forgetting.

15 Jul 2019 · 21min

Transfer Learning

Sebastian Ruder is a research scientist at DeepMind. In this episode, he joins us to discuss the state of the art in transfer learning and his contributions to it.

8 Jul 2019 · 29min

Facebook Bargaining Bots Invented a Language

In 2017, Facebook published a paper called Deal or No Deal? End-to-End Learning for Negotiation Dialogues. In this research, the reinforcement learning agents developed a mechanism of communication (which could be called a language) that made them able to optimize their scores in the negotiation game. Many media sources reported this as if it were a first step towards Skynet taking over. In this episode, Kyle discusses bargaining agents and the actual results of this research.

21 Jun 2019 · 23min

Under Resourced Languages

Priyanka Biswas joins us in this episode to discuss natural language processing for languages that do not have as many resources as those that are more commonly studied, such as English. Successful NLP projects benefit from the availability of resources like large corpora, well-annotated corpora, software libraries, and pre-trained models. For languages that researchers have not paid as much attention to, these tools are not always available.

15 Jun 2019 · 16min

Named Entity Recognition

Kyle and Linh Da discuss the class of approaches called "Named Entity Recognition" or NER.  NER algorithms take any string as input and return a list of "entities" - specific facts and agents in the text along with a classification of the type (e.g. person, date, place).
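The input/output contract described above can be illustrated with a toy sketch. This is not a real NER system (production NER uses statistical models, not hand-written patterns); the entity patterns below are hypothetical, chosen only to show the "string in, typed entity list out" shape:

```python
import re

def toy_ner(text):
    """Toy illustration of the NER contract: take any string,
    return a list of (entity, type) pairs."""
    # Hand-written patterns standing in for a learned model.
    patterns = {
        "DATE": r"\b\d{1,2} [A-Z][a-z]+ \d{4}\b",
        "PERSON": r"\b(?:Kyle|Linh Da)\b",
    }
    entities = []
    for label, pattern in patterns.items():
        for match in re.finditer(pattern, text):
            entities.append((match.group(), label))
    return entities

print(toy_ner("Kyle spoke with Linh Da on 8 Jun 2019."))
```

A real system (e.g. a pre-trained pipeline from an NLP library) would return the same kind of structure, but with entity spans predicted by a model rather than matched by fixed patterns.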

8 Jun 2019 · 17min
