
seq2seq
A sequence to sequence (or seq2seq) model is a neural architecture used for translation (and other tasks) that consists of an encoder and a decoder. The encoder/decoder architecture has obvious promise...
1 Mar 2019 · 21 min
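For a concrete sense of the encoder/decoder split, here is a minimal seq2seq sketch in PyTorch; the vocabulary sizes, hidden dimension, and greedy decoding loop are illustrative assumptions, not the specific model discussed in the episode.

```python
# A minimal sketch of the encoder/decoder idea behind seq2seq, using PyTorch.
# Sizes and the greedy decoding loop are illustrative assumptions.
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, HIDDEN = 1000, 1200, 64

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) of token ids; the final hidden state summarizes the sentence
        _, hidden = self.rnn(self.embed(src))
        return hidden

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, HIDDEN)
        self.rnn = nn.GRU(HIDDEN, HIDDEN, batch_first=True)
        self.out = nn.Linear(HIDDEN, TGT_VOCAB)

    def forward(self, prev_token, hidden):
        # One decoding step: previous target token + hidden state -> next-token logits
        output, hidden = self.rnn(self.embed(prev_token), hidden)
        return self.out(output), hidden

encoder, decoder = Encoder(), Decoder()
src = torch.randint(0, SRC_VOCAB, (1, 7))      # a dummy source sentence
hidden = encoder(src)                          # encode the source once
token = torch.zeros(1, 1, dtype=torch.long)    # assume token id 0 is <sos>
for _ in range(5):                             # greedily decode a few target tokens
    logits, hidden = decoder(token, hidden)
    token = logits.argmax(dim=-1)
print(token)
```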

Text Mining in R
Kyle interviews Julia Silge about her path into data science, her book Text Mining with R, and some of the ways in which she's used natural language processing in projects both personal and professional...
22 Feb 2019 · 20 min

Recurrent Relational Networks
One of the most challenging NLP tasks is natural language understanding and reasoning. How can we construct algorithms that achieve human-level understanding of text and are able to answer ...
15 Feb 2019 · 19 min

Text World and Word Embedding Lower Bounds
In the first half of this episode, Kyle speaks with Marc-Alexandre Côté and Wendy Tay about Text World. Text World is an engine that simulates text adventure games. Developers are encouraged to try ...
8 Feb 2019 · 39 min

word2vec
Word2vec is an unsupervised machine learning model that captures semantic information from the text it is trained on. The model is based on neural networks. Several large organizations like...
1 Feb 2019 · 31 min
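As a rough illustration of the idea, the sketch below trains word2vec with gensim on a toy corpus; the sentences and parameter values are assumptions for demonstration (gensim 4.x API), and real models are trained on far larger text collections.

```python
# A small sketch of training word2vec with gensim on a toy corpus.
from gensim.models import Word2Vec

sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["dogs", "and", "cats", "are", "animals"],
]

# skip-gram (sg=1) learns vectors that place words with similar contexts close together
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

print(model.wv["king"][:5])                    # a learned dense vector
print(model.wv.most_similar("king", topn=3))   # nearest neighbours in embedding space
```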

Authorship Attribution
In a recent paper, Leveraging Discourse Information Effectively for Authorship Attribution, authors Su Wang, Elisa Ferracane, and Raymond J. Mooney describe a deep learning methodology for predicting which...
25 Jan 2019 · 50 min
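The snippet below is not the discourse-based model from the paper; it is just a common scikit-learn baseline for authorship attribution (character n-gram features plus a linear classifier), with made-up example texts and author labels.

```python
# A standard authorship-attribution baseline: character n-grams + linear classifier.
# The toy texts and author labels are fabricated for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "It was a truth universally acknowledged, in my humble opinion.",
    "In my humble opinion, the truth of the matter stands acknowledged.",
    "The rocket's telemetry cut out; the data stream went dark.",
    "Telemetry failed and the data stream from the rocket went dark.",
]
authors = ["author_a", "author_a", "author_b", "author_b"]

# character n-grams capture stylistic habits (punctuation, function words, morphology)
clf = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, authors)
print(clf.predict(["In my humble opinion, the telemetry is universally acknowledged."]))
```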

Very Large Corpora and Zipf's Law
The earliest efforts to apply machine learning to natural language tended to convert every token (every word, more or less) into a unique feature. While techniques like stemming may have cut the number...
18 Jan 2019 · 24 min
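To see Zipf's law in action, the sketch below counts word frequencies in a public-domain text (NLTK's Gutenberg sample of Austen's Emma, chosen here as an assumed stand-in for a large corpus) and checks that frequency times rank stays roughly constant.

```python
# A quick check of Zipf's law: word frequency is roughly proportional to 1 / rank,
# so frequency * rank should hover around a constant for the top-ranked words.
from collections import Counter
import nltk

nltk.download("gutenberg", quiet=True)
words = [w.lower() for w in nltk.corpus.gutenberg.words("austen-emma.txt") if w.isalpha()]

counts = Counter(words)
for rank, (word, freq) in enumerate(counts.most_common(10), start=1):
    # under Zipf's law the last column stays roughly constant
    print(f"{rank:>2}  {word:<6} freq={freq:>5}  rank*freq={rank * freq}")
```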

Semantic search at Github
Github is many things besides source control. It's a social network, even though not everyone realizes it. It's a vast repository of code. It's a ticketing and project management system. And of course...
11 Jan 2019 · 34 min

















