David Gerard

to be fair: the "EA" movement is three or four movements in a trenchcoat. some do good stuff, others are rationalist AI cultists.

EAs are desperately sincere, and they probably make more good things happen than would happen otherwise

but they need to get rid of the AI cultists

who are also the Longtermists, who literally care more about 10^54 hypothetical future human emulations running on computers than about real people who exist now, and who care more about the possible suffering of electrons than the real suffering caused by climate change or racism

and the FTX-Alameda crew were hardcore from the AI cultist wing of EA

it's systemic, because the AI cultists named it "Effective Altruism" and gathered the others into the trenchcoat, and still do a lot of the organisational slog, so it's hard to get free of them.

This FTX catastrophe offers a chance: so many EA initiatives got screwed over too.

how to infuriate an EA: point out that the Make-A-Wish Foundation is literally a more efficient use of charitable dollars than MIRI, the AI cultists' charity

take care not to let EAs ambit-claim the concept of charity, or the concept of measuring charity; even the good ones have a nasty habit of trying

Michael Busch

@davidgerard I had somehow missed that particular misunderstanding of everything by the effective altruists.

Although I mostly stopped talking to them after the "that's not how Bayesian statistics work" and "no, we don't need to spread throughout the universe because of the asteroid impact hazard" conversations.
