[Linkpost] “The best cause will disappoint you: An intro to the optimisers curse” by titotal

This is a link post.

I would like to thank David Thorstad for looking over this. If you spot a factual error in this article, please message me. The code used to generate the graphs in the article is available to view here.

Introduction

Say you are an organiser, tasked with achieving the best result on some metric, such as “trash picked up”, “GDP per capita”, or “lives saved by an effective charity”. There are several possible interventions you could take to try to achieve this. How do you choose between them?

The obvious thing to do is to look at each intervention in turn, make your best, unbiased estimate of how it will perform on your metric, and pick the one that performs best:

Image taken from here

Having done this ranking, you declare the top-ranking program to be the best intervention and invest in it, expecting that your top estimate will be the result you get. This whole procedure is completely normal, and people all around the world, including people in the effective altruist community, do it all the time.

In actuality, this procedure is not correct. The optimiser's curse is [...]
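The article's own simulation code is linked above and not reproduced here, but the effect is easy to demonstrate. The following is a minimal sketch, assuming (purely for illustration) standard-normal true intervention effects and unbiased Gaussian estimation noise: even though every individual estimate is unbiased, the estimate of whichever option ranks first systematically overshoots its true value.

```python
import random

random.seed(0)

N_TRIALS = 20_000   # simulated selection rounds
N_OPTIONS = 10      # candidate interventions per round
NOISE_SD = 1.0      # stdev of the (unbiased) estimation error

shortfall = 0.0
for _ in range(N_TRIALS):
    # True effect of each intervention (all equally good on average).
    true = [random.gauss(0.0, 1.0) for _ in range(N_OPTIONS)]
    # Unbiased estimate: true value plus zero-mean noise.
    est = [t + random.gauss(0.0, NOISE_SD) for t in true]
    # Pick the intervention with the best *estimate* ...
    best = max(range(N_OPTIONS), key=lambda i: est[i])
    # ... and record how far its true value falls short of the estimate.
    shortfall += est[best] - true[best]

print(f"mean shortfall of the winner: {shortfall / N_TRIALS:.3f}")
# The mean shortfall comes out clearly positive: the top-ranked
# option disappoints on average, despite unbiased estimates.
```

Selecting on the maximum estimate preferentially selects for positive noise, which is exactly the bias the article goes on to analyse.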

---

Outline:

(00:26) Introduction

(02:17) The optimiser's curse explained simply

(04:42) Introducing a toy model

(08:45) Introducing speculative interventions

(12:15) A simple Bayesian correction

(18:47) Obstacles to simple optimiser's curse solutions

(22:08) How GiveWell has reacted to the optimiser's curse

(25:18) Conclusion

---

First published:
February 11th, 2026

Source:
https://forum.effectivealtruism.org/posts/q2TfTirvspCTH2vbZ/the-best-cause-will-disappoint-you-an-intro-to-the

Linkpost URL:
https://open.substack.com/pub/titotal/p/the-best-cause-will-disappoint-you?r=1e0is3&utm_campaign=post&utm_medium=web&showWelcomeOnShare=true

---

Narrated by TYPE III AUDIO.

---

Images from the article:

Apple Podcasts and Spotify do not show images in the episode description. Try Pocket Casts or another podcast app.

Episodes (250)

[Linkpost] “Starfish” by Aaron Gertler 🔸

This is a link post. By Alexander Wales Thousands of starfish had washed up on the beach, and a little girl was diligently throwing them back into the water, one at a time. A man came up to the girl a...

1 May 4min

“Time Sensitive Urgent Animal Welfare Action” by Bentham’s Bulldog

The EATS Act (now called the Save Our Bacon Act) would make it illegal for states to pass animal welfare laws that apply to products produced out of state. This would gut most state-level animal protect...

29 Apr 1min

“Forecasting is Way Overrated, and We Should Stop Funding It” by Marcus Abramovitch 🔸

Summary EA and rationalists got enamoured with forecasting and prediction markets and made them part of the culture, but this hasn’t proven very useful, yet it continues to receive substantial EA fun...

26 Apr 8min

“My lover, effective altruism” by Natalie_Cargill

Crossposted from Substack. This post is part of a 30-posts-in-30-days ordeal at Inkhaven. All suboptimalities are the result of that. This is part 2, here is part 1 in my EA mini series! On my way to ...

26 Apr 8min

“A Database of Near-Term Interventions for Wild Animals” by Bob Fischer

The Animal Welfare Department (AWD) at Rethink Priorities supports high-impact strategies to help animals, especially where suffering is vast and largely neglected. Therefore, one of our focus areas i...

24 Apr 17min

“The AI people have been right a lot” by Dylan Matthews

This post was crossposted from Dylan Matthews's blog by the EA Forum team. The author may not see or reply to comments. Subtitle: Try to keep an open mind as the world gets increasingly wild. The crowd ...

20 Apr 10min

[Linkpost] “The Anthropic IPO Is Coming. We Aren’t Ready for It.” by Sophie Kim

This is a link post. More money is coming than AI safety has ever seen. The capacity to deploy it doesn't exist yet. Image Source: Fortune.com This week, Anthropic announced Claude Mythos Preview– a mo...

17 Apr 15min

“AI Safety’s Biggest Talent Gap Isn’t Researchers. It’s Generalists.” by Topaz, Agustín Covarrubias 🔸, Alexandra Bates, Parv Mahajan, Kairos

This post was crossposted to LessWrong. TL;DR: One of the largest talent gaps in AI safety is competent generalists: program managers, fieldbuilders, operators, org leaders, chiefs of staff, founders....

16 Apr 13min
