Discussion about this post

Rhapsodist

“So many Twitter critics confidently repeat some utterly conventional thought as though the mere fact of going against the conventional wisdom is evidence that EAs are nuts.”

If you replace the word “evidence” with “conclusive proof,” this accurately sums up literally every criticism of longtermism I’ve ever read.

MorningLightMountain

I think I can explain how someone can be hostile to EA ideas despite finding them obviously true. It's not that they're hostile to the ideas themselves, but to people consciously adopting them as goals. More generally, the EA criticisms you see are, low-decoupling style, mostly not criticisms of the ideas on their own merits. They are instead (very strong and universal) claims about the psychological and social effects of believing those ideas, effects the critics think cause problems that are not unique to current EA institutions but basically intrinsic to any human attempt to implement EA ideas.

The claim is something like: the idea that you should seek to do cause-impartial beneficent good is such a brain rot, and so terribly corrosive to human motivation, that even though on paper there's no possible way pursuing it could be worse than never thinking about it, in real life it just destroys you.

According to these critics, every time anyone tries to adopt this as a deliberate goal it's like picking up the One Ring: you're nearly guaranteed to end up in ruinous failure because...? There are a bunch of reasons offered, some of which contradict each other:

- It smuggles in being okay with the status quo and rules out overthrowing modern civilization to produce something else.

- It sets a goal so distant and broad that you can justify anything with your biases and/or be tricked, which makes you easy to manipulate.

- It gives you a sense of superiority over everyone around you and licenses actions only very distantly connected to doing good in the here and now, so you can always justify pretty much any bad thing you want to do as being part of the greater good.

- If you believe in EA for real, it corrodes your soul, stops you from having close human relationships, and lets you neglect side constraints and the instinctive warning signs that you're going wrong.

The claim isn't that any of these are intrinsic features of the ideas, but just that if you start believing strongly enough that you should do impartially beneficent good, because of the way human minds work, you'll get captured and possessed by this mindset and turn into a moral monster no matter what.

So on this view, if you do care about impartially beneficent good, you have to do something like trick yourself into thinking that's not what you really want, and pursue some narrower local project with tighter feedback loops. BUT of course you have to forget that this is why you did it, and forget the act of forgetting... doublethink style.

And obviously no real evidence is given that this is how it necessarily goes, other than pointing at a few high-profile EA failures, as if there aren't also high-profile failures all over the place in more local and partial attempts to do good. (And as if the usually preferred alternative of starting an anti-capitalist revolution doesn't have every problem just listed, to a far greater extreme.)

It's essentially a conspiracy theory/genetic fallacy psychoanalysis argument. And this view also can't account for the good that EA has unequivocally done except to say something like "oh that all happened before you got fully corrupted/as an accidental contingent side effect on the way to full corruption".

And of course it's also diametrically opposed to the point you quote at the start of your post, i.e. that EA ideas are somehow both obvious tautologies and so extreme and strange that taking them seriously cores open your brain and instantly turns you into a moral monster.
