Algorithmic Cancer: Why AI Development Is Not What You Think with Connor Leahy

Recently, the risks of Artificial Intelligence and the need for 'alignment' have been flooding our cultural discourse – with Artificial Super Intelligence acting as both the most promising goal and the most pressing threat. But amid the moral debate, surprisingly little attention has been paid to a basic question: do we even have the technical capability to guide where any of this is headed? And if not, should we slow the pace of innovation until we better understand how these complex systems actually work?

In this episode, Nate is joined by AI developer and researcher Connor Leahy to discuss the rapid advancements in AI, the potential risks associated with its development, and the challenges of controlling these technologies as they evolve. Connor also explains the phenomenon he calls 'algorithmic cancer': AI-generated content that crowds out genuine human creations, propelled by algorithms that can't tell the difference. Together, they unpack the implications of AI acceleration, from widespread job disruption and energy-intensive computing to the concentration of wealth and power in tech companies.

What kinds of policy and regulatory approaches could help slow down AI's acceleration in order to create safer development pathways? Is there a world where AI becomes a tool to aid human work and creativity, rather than replacing it? And how do these AI risks connect to the deeper cultural conversation about technology's impacts on mental health, meaning, and societal well-being?

(Conversation recorded on May 21st, 2025)

About Connor Leahy:

Connor Leahy is the founder and CEO of Conjecture, which works on aligning artificial intelligence systems by building infrastructure that allows for the creation of scalable, auditable, and controllable AI.

Previously, he co-founded EleutherAI, which was one of the earliest and most successful open-source Large Language Model communities, as well as a home for early discussions on the risks of those same advanced AI systems. Prior to that, Connor worked as an AI researcher and engineer for Aleph Alpha GmbH.

Show Notes and More

Watch this video episode on YouTube

Want a broad overview of The Great Simplification in 30 minutes? Watch our Animated Movie.

---

Support The Institute for the Study of Energy and Our Future

Join our Substack newsletter

Join our Discord channel and connect with other listeners

Episodes (359)

Leon Simons: "Aerosol Demasking & Global Heating"

On this episode, Nate is joined by climate researcher Leon Simons to unpack recent trends in global heating during 2023 and potential explanations and subsequent projections for the coming year. While...

17 Jan 2024 · 1h 24min

Jane Muncke: "Perils of Plastic Packaging"

On this episode, toxicology scientist Dr. Jane Muncke joins Nate to discuss the current state of food production and the effects of ultra processed foods and their packaging on our health. Over the la...

10 Jan 2024 · 1h 19min

The Behavioral Stack | Frankly #52

Recorded December 18, 2023. In this Frankly, Nate offers a personal reflection on his learnings about 'awareness' vs. 'focus' and how this knowledge could be used as a guide toward more t...

5 Jan 2024 · 18min

Peter Brannen: "Deep Time, Mass Extinctions, and Today"

On this episode, Nate is joined by Peter Brannen, science journalist and author specializing in Earth's prior mass extinctions, to unpack our planet's geologic history and what it can tell us about ou...

3 Jan 2024 · 1h 42min

Bill McKibben: "Climate, Movements, and Power"

On this episode, environmental activist and author Bill McKibben joins Nate for a reflection on the last few decades of climate education and movements – and the possibilities and challenges that we'l...

20 Dec 2023 · 1h 19min

Systemic Themes for 2024 | Frankly #51

Recorded December 17, 2023. In this final Frankly of 2023, Nate outlines some global themes that are worth keeping an eye on in 2024. From climate change to domestic and global politics ...

18 Dec 2023 · 14min

Arthur Berman: "Shale Oil and the Slurping Sound"

On this episode, Arthur Berman returns to unpack the complexity underpinning the oil trends of the last 75 years and what new data can tell us about availability in the coming years. After decades of ...

13 Dec 2023 · 1h 30min

Stephanie Hoopes, Peter Kilde, Marc Perry, Dalitso Sulamoyo: "Poverty Blind" | Reality Roundtable #7

On this Reality Roundtable, Nate is joined by four professionals with decades of experience working with low-income communities – Stephanie Hoopes, Peter Kilde, Marc Perry, and Dalitso Sulamoyo – to disc...

10 Dec 2023 · 1h 22min
