The Computational Complexity of Machine Learning
Data Skeptic
3 Nov 2017

In this episode, Professor Michael Kearns from the University of Pennsylvania joins host Kyle Polich to talk about the computational complexity of machine learning, complexity in game theory, and algorithmic fairness. Michael's doctoral thesis gave an early broad overview of computational learning theory, emphasizing the mathematical study of efficient learning algorithms by machines or other computational systems.

Machine learning algorithms are, in some sense, meta-algorithms: given data, a learning algorithm builds a model, and it will presumably behave very differently under different inputs. Does that mean we need new analytical tools? Or is a machine learning algorithm just like any deterministic algorithm, only a bit trickier to analyze complexity-wise? In other words, how much overlap is there between the good old-fashioned analysis of algorithms and the complexity analysis of machine learning algorithms? And how do strategies for bounding sample complexity differ from those for bounding computational complexity?

A big area of machine learning (and of the analysis of learning algorithms in general) that Michael and Kyle discuss is complexity regularization. Complexity regularization asks: how should one measure the goodness of fit and the complexity of a given model, how should one balance the two, and how can one do so in a scalable, algorithmically efficient way? From there, Michael and Kyle discuss the broader picture of why one should care whether a learning problem is efficiently learnable, that is, learnable in polynomial time.
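As a concrete illustration (not from the episode itself), the "fit plus penalty" trade-off at the heart of complexity regularization can be sketched with ridge regression, where the squared L2 norm of the weights stands in as a simple proxy for model complexity; the function name and data here are hypothetical.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form minimizer of ||Xw - y||^2 + lam * ||w||^2.

    The first term measures goodness of fit; the second penalizes
    model complexity; lam controls the balance between the two.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=50)

w_low = ridge_fit(X, y, lam=0.01)    # weak penalty: fit dominates
w_high = ridge_fit(X, y, lam=100.0)  # strong penalty: complexity shrinks
```

A stronger penalty trades some goodness of fit for a simpler (smaller-norm) model; the open algorithmic question the episode gestures at is how to carry out this balancing efficiently for richer model classes where no closed form exists.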

Another interesting topic of discussion is the difference between sample complexity and computational complexity. An active area of research is how one should regularize a model so that its complexity is balanced against its goodness of fit on a large training sample.
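To make the distinction concrete (this worked example is ours, not from the episode): for a finite hypothesis class H, Hoeffding's inequality plus a union bound gives a classic sample-complexity guarantee, with m at least ln(2|H|/delta) / (2 * eps^2) samples sufficing for every hypothesis's empirical error to be within eps of its true error with probability at least 1 - delta. Crucially, this bound says nothing about the computational cost of finding a good hypothesis, which is the separate question of computational complexity.

```python
import math

def sample_complexity(h_size, eps, delta):
    """Samples sufficient for uniform eps-accurate error estimates
    over a finite hypothesis class of size h_size, with probability
    at least 1 - delta (Hoeffding + union bound)."""
    return math.ceil(math.log(2 * h_size / delta) / (2 * eps ** 2))

# A class of a million hypotheses needs only a few thousand samples,
# since the bound grows logarithmically in |H|.
m = sample_complexity(h_size=10**6, eps=0.05, delta=0.01)
```

Note the logarithmic dependence on |H|: statistically, even huge hypothesis classes are cheap, so the bottleneck in Kearns-style efficient learnability is usually computation, not data.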

As mentioned, a good resource for getting started with correlated equilibria is: https://www.cs.cornell.edu/courses/cs684/2004sp/feb20.pdf

Thanks to our sponsors:

Mendoza College of Business - Get your Master of Science in Business Analytics from Notre Dame.

brilliant.org - A fun, affordable, online learning tool. Check out their Computer Science Algorithms course.

This episode is taken from an open RSS feed and is not published by Podme. It may contain advertising.

Episodes (601)

Github Collaboration Network

In this episode we discuss the GitHub Collaboration Network with Behnaz Moradi-Jamei, assistant professor at James Madison University.  As a network scientist, Behnaz created and analyzed a network of...

11 Nov 2024, 42min

Graphs and ML for Robotics

We are joined by Abhishek Paudel, a PhD Student at George Mason University with a research focus on robotics, machine learning, and planning under uncertainty, using graph-based methods to enhance rob...

4 Nov 2024, 41min

Graphs for HPC and LLMs

We are joined by Maciej Besta, a senior researcher of sparse graph computations and large language models at the Scalable Parallel Computing Lab (SPCL). In this episode, we explore the intersection of...

29 Oct 2024, 52min

Graph Databases and AI

In this episode, we sit down with Yuanyuan Tian, a principal scientist manager at Microsoft Gray Systems Lab, to discuss the evolving role of graph databases in various industries such as fraud detect...

21 Oct 2024, 35min

Network Analysis in Practice

Our new season "Graphs and Networks" begins here!  We are joined by new co-host Asaf Shapira, a network analysis consultant and the podcaster of NETfrix – the network science podcast. Kyle and Asaf di...

14 Oct 2024, 29min

Animal Intelligence Final Exam

Join us for our capstone episode on the Animal Intelligence season. We recap what we loved, what we learned, and things we wish we had gotten to spend more time on. This is a great episode to see how ...

7 Oct 2024, 30min

Process Mining with LLMs

David Obembe, a recent University of Tartu graduate, discussed his Master's thesis on integrating LLMs with process mining tools. He explained how process mining uses event logs to create maps that ide...

24 Sep 2024, 26min

Open Animal Tracks

Our guest today is Risa Shinoda, a PhD student at Kyoto University Agricultural Systems Engineering Lab, where she applies computer vision techniques. She talked about the OpenAnimalTracks dataset and...

17 Sep 2024, 22min
