David Gerard

You may have heard that the principals of the collapsed FTX crypto exchange were into Effective Altruism.

Let me explain what Effective Altruism is:

* Some charities are more effective than others, and you should donate to the more effective ones.

yeah, sounds obvious and sensible

* As a first-worlder, you are basically rich, even if you don’t feel like it, and you therefore have an ethical obligation to contribute to those who aren’t - almost certainly more than you do now.

this is pretty sound reasoning actually, I can get behind this

* Therefore, we can and should stack-rank every charitable initiative in the world according to an objective numerical scoring system,

wait what

* and clearly the most cost-effective initiative possible for all of humanity is donating to fight the prospect of unfriendly artificial intelligence,,

what the

* and oh look, we just happen to have a charity for that precise purpose right here! WHAT ARE THE ODDS,,,,

fuckin

4 comments
David Gerard

to be fair: the "EA" movement is three or four movements in a trenchcoat. some do good stuff, others are rationalist AI cultists.

EAs are desperately sincere, and they probably make more good things happen than would happen otherwise

but they need to get rid of the AI cultists

who are also the Longtermists, who literally care more about 10^54 hypothetical future human emulations running on computers than about real people who exist now, and who care more about the possible suffering of electrons than the real suffering caused by climate change or racism

and the FTX-Alameda crew were hardcore from the AI cultist wing of EA

it's systemic, because the AI cultists named it "Effective Altruism" and gathered the others into the trenchcoat, and still do a lot of the organisational slog, so it's hard to get free of them.

This FTX catastrophe offers a chance: so many EA initiatives got screwed over too.

how to infuriate an EA: point out that the Make-A-Wish Foundation is literally a more efficient use of charitable dollars than MIRI, the AI cultists' charity

take care not to let EAs ambit-claim the concept of charity, or the concept of measuring charity, something even the good ones have a nasty habit of trying


Michael Busch

@davidgerard I had somehow missed that particular misunderstanding of everything by the effective altruists.

Although I mostly stopped talking to them after "that's not how Bayesian statistics work" and "no, we don't need to spread throughout the universe because of the asteroid impact hazard" conversations.
