[MINI] One Shot Learning
Data Skeptic · 22 Sep 2017

One Shot Learning is the class of machine learning procedures that focuses on learning from a small number of examples. This is in contrast to "traditional" machine learning, which typically requires a very large training set to build a reasonable model.

In this episode, Kyle presents a coded message to Linhda, who is able to recognize that many of the newly created symbols are likely instances of the same symbol, despite having extremely few examples of each. Why can the human brain recognize a new symbol with relative ease, while most machine learning algorithms require large amounts of training data? We discuss some of the reasons why, along with approaches to One Shot Learning.
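One common approach to the problem described above is to classify a new symbol by comparing it, in a learned embedding space, against a single labeled "support" example per class. The sketch below illustrates that idea with nearest-neighbour matching; the 2-D vectors and class names are made up for illustration (in practice the embeddings would come from a trained network such as a Siamese network):

```python
import numpy as np

# One labeled "support" example per symbol class. The embeddings here are
# hypothetical 2-D vectors; a real system would produce them with a trained
# embedding network.
support = {
    "symbol_a": np.array([0.9, 0.1]),
    "symbol_b": np.array([0.1, 0.9]),
}

def classify(query: np.ndarray) -> str:
    """Assign the query to the class of its nearest support example."""
    return min(support, key=lambda label: np.linalg.norm(query - support[label]))

# A new, slightly perturbed drawing of symbol_a is matched from one example.
print(classify(np.array([0.8, 0.2])))  # symbol_a
```

The key point is that no per-class retraining happens: recognizing a new symbol only requires one reference example and a distance comparison.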


Episodes (601)

Facebook Bargaining Bots Invented a Language

In 2017, Facebook published a paper called Deal or No Deal? End-to-End Learning for Negotiation Dialogues. In this research, the reinforcement learning agents developed a mechanism of communication (w...

21 Jun 2019 · 23 min

Under Resourced Languages

Priyanka Biswas joins us in this episode to discuss natural language processing for languages that do not have as many resources as those that are more commonly studied such as English.  Successful NL...

15 Jun 2019 · 16 min

Named Entity Recognition

Kyle and Linh Da discuss the class of approaches called "Named Entity Recognition" or NER.  NER algorithms take any string as input and return a list of "entities" - specific facts and agents in the t...

8 Jun 2019 · 17 min

The Death of a Language

USC students from the CAIS++ student organization have created a variety of novel projects under the mission statement of "artificial intelligence for social good". In this episode, Kyle interviews Za...

1 Jun 2019 · 20 min

Neural Turing Machines

Kyle and Linh Da discuss the concepts behind the neural Turing machine.

25 May 2019 · 25 min

Data Infrastructure in the Cloud

Kyle chats with Rohan Kumar about hyperscale, data at the edge, and a variety of other trends in data engineering in the cloud.

18 May 2019 · 30 min

NCAA Predictions on Spark

In this episode, Kyle interviews Laura Edell at MS Build 2019.  The conversation covers a number of topics, notably her NCAA Final 4 prediction model.

11 May 2019 · 23 min

The Transformer

Kyle and Linhda discuss attention and the transformer — an encoder/decoder architecture that extends the basic ideas of vector embeddings, like word2vec, into a more contextual use case.

3 May 2019 · 15 min
