Algorithmic Cancer: Why AI Development Is Not What You Think with Connor Leahy

Recently, the risks of Artificial Intelligence and the need for 'alignment' have been flooding our cultural discourse – with Artificial Super Intelligence cast as both the most promising goal and the most pressing threat. But amid the moral debate, surprisingly little attention has been paid to a basic question: do we even have the technical capability to guide where any of this is headed? And if not, should we slow the pace of innovation until we better understand how these complex systems actually work?

In this episode, Nate is joined by Artificial Intelligence developer and researcher Connor Leahy to discuss the rapid advancements in AI, the potential risks associated with its development, and the challenges of controlling these technologies as they evolve. Connor also explains the phenomenon of what he calls 'algorithmic cancer' – AI-generated content that crowds out true human creations, propelled by algorithms that can't tell the difference. Together, they unpack the implications of AI acceleration, from widespread job disruption and energy-intensive computing to the concentration of wealth and power in the hands of tech companies.

What kinds of policy and regulatory approaches could help slow down AI's acceleration in order to create safer development pathways? Is there a world where AI becomes a tool to aid human work and creativity, rather than replacing it? And how do these AI risks connect to the deeper cultural conversation about technology's impacts on mental health, meaning, and societal well-being?

(Conversation recorded on May 21st, 2025)

About Connor Leahy:

Connor Leahy is the founder and CEO of Conjecture, which works on aligning artificial intelligence systems by building infrastructure that allows for the creation of scalable, auditable, and controllable AI.

Previously, he co-founded EleutherAI, which was one of the earliest and most successful open-source Large Language Model communities, as well as a home for early discussions on the risks of those same advanced AI systems. Prior to that, Connor worked as an AI researcher and engineer for Aleph Alpha GmbH.

Show Notes and More

Watch this video episode on YouTube

Want a broad overview of The Great Simplification in 30 minutes? Watch our Animated Movie.

---

Support The Institute for the Study of Energy and Our Future

Join our Substack newsletter

Join our Discord channel and connect with other listeners

Episodes (358)

Time Travel & The Superorganism: A Movie Idea | Frankly 81

Time Travel & The Superorganism: A Movie Idea | Frankly 81

(Recorded December 16, 2024) As we wrap up another year of thought-provoking discussions on The Great Simplification, Nate takes us on an imaginative journey in this week's Frankly - exploring a pot...

20 December 2024 · 18min

The Great Simplification in Action: Building Resilience Through Local Communities with Christian Sawyer

(Conversation recorded on November 7th, 2024) Long-time listeners of The Great Simplification may have a good grasp of the many impending crises that humanity faces. But once we understand the sco...

18 December 2024 · 1h 10min

"Thank You for Ruining My Life" | Frankly 80

"Thank You for Ruining My Life" | Frankly 80

(Recorded December 5, 2024) It's not every day that a stranger thanks you for 'ruining their life'. In this heartfelt Frankly, Nate reflects on a powerful encounter with a venture capitalist whose li...

13 December 2024 · 11min

The Baby Bust: How The Toxicity Crisis Could Cause the Next Economic Crash with Jeremy Grantham

(Conversation recorded on November 5th, 2024) It is no secret that population dynamics significantly impact global stability. But what's really behind today's shifting global birth trends, the inc...

11 December 2024 · 1h 47min

Shutting Off The Plastic Tap: A Global Treaty To Regulate Petrochemical Pollution? with Jane Muncke

(Conversation recorded on December 2nd, 2024) One of the central ecological challenges of our time is addressing the plastic and petrochemical pollution that has exploded over the past several decad...

8 December 2024 · 47min

The Biggest Takeaways from the Logic of the Superorganism

(Recorded November 26, 2024) As we piece together the different facets of our reality, the systems synthesis which emerges confronts us with some uncomfortable truths. These are the advanced inferen...

6 December 2024 · 20min

Existential Risks: The Biggest Threats to Life as We Know It with Luke Kemp

(Conversation recorded on October 22nd, 2024) The human system as we know it today – which powers our economies, global supply chains, and social contracts – is a fragile network based on innumera...

4 December 2024 · 1h 41min

A Brief Clarification on Human Behavior | Frankly 78

(Recorded November 21, 2024) Two weeks ago, in a Frankly called The Battles of Our Time, Nate commented on human behavior and said that, in today's world, only three to four percent of humans are go...

29 November 2024 · 9min
