20 Comments
Aug 27 · Liked by Richard Y Chappell

> Crary (@38:21) objected to "earning to give" on the purely rhetorical grounds that it positions rich people as "saviors" of the poor, and "situates the global poor in something like the position of new colonial subjects." I found this so slimy.

I find this type of reasoning just so bad. I share your disgust. Is there a good word for this type of argument? It’s kind of like Scott Alexander’s non-central fallacy but even worse. Argument by vague analogy, lacking all the relevant moral characteristics.

Aug 27 · Liked by Richard Y Chappell

We need a name for this, as it seems to be responsible for around 98% of bad left-wing arguments by activists, and a lot of the bad right-wing arguments too.

Author
Aug 27 · edited Aug 27

"Guilt by rhetorical association"? Maybe someone can suggest a catchier name. I agree it's maddeningly common.

[Edited to remove 'arbitrary' from the suggested phrase. Brief is better.]


So Peter Singer--the quintessential real-world effective altruist--isn't a real EA because he doesn't conform to Crary's ridiculous strawmen. What a joke!


> Crary’s [...] observation that EA doesn’t tend to fund the social justice advocacy of her political allies.

I think Crary's objection boils down to "Oh no! Those EAs are helping people, and being effective at it. This means that they'll get some status that I'd rather went to me and my friends."

I realise that's a maximally uncharitable take on Crary. But I also think it is the most parsimonious explanation.


Devastating!

Sep 4 · Liked by Richard Y Chappell

Your next post after this one has a paywalled section, which is completely reasonable. But unfortunately, it seems that footnotes are placed below the paywall by default, even when the position in the text they correspond to is above the paywall. I was worried that Substack had changed the footnote feature, but the paywall seems to be why I couldn't see the footnotes.

Author
Sep 4 · edited Sep 4

Oh, that's annoying! Thanks for letting me know. I'll try contacting Substack support to see if there's any way to change that.

Author

In the meantime, I've screenshotted the most substantive ones here:

https://substack.com/profile/32790987-richard-y-chappell/note/c-67844544


> It positions rich people as "saviours" of poor people

Help is help! I don't believe anyone in a hard situation would reject help just because the person giving it is wealthy. Very strange reasoning indeed.

Aug 30 · edited Aug 30

Sorry, I haven't had time to cite examples, but I don't really like your tone in some of your writing re social justice type professors. It seems kind of mean, and it doesn't seem like you're really trying to understand their perspective. That's a comment on a number of things you've written, not just this article. I'm not gonna elaborate right now (maybe later), but for now just take it or leave it.

Author

Ok, I appreciate the feedback. If you do end up expanding upon your concern, I'd be curious about the extent to which this is just a matter of my responding very critically to the (IMO unreasonable on their part) hostility that's very publicly promoted by Crary, Wenar, etc., or in what cases you think my response is actually unwarranted.

(I assume that it's warranted to be hostile towards those who unreasonably initiate hostilities. And I'm sure you're aware that social justice ideology is both (i) overwhelmingly dominant in academia, and (ii) not exactly known for its understanding or openness to diverging viewpoints.)

Aug 30 · edited Aug 30 · Liked by Richard Y Chappell

I really don't understand the conflict between EA and leftism/social justice. It seems to me EA fits nicely with leftist ideals.

I think it's sick we have a system that produces billionaires with so much money they build space rockets for fun along with millions of children suffering from something as easy to fix as a vitamin A deficiency. But I don't see any tension between 1. Wanting that system to change and fighting to change it and 2. thinking, in the meantime, it's good to donate to alleviate suffering caused by that system.

I also see no conflict between wanting democracy extended to the workplace, or wanting massive reforms to the criminal justice system, etc., and giving to orgs that distribute anti-malaria nets.

All of the hostility is very disappointing; leftists should be allies to EA.

Author

Yes, I very much agree. As a result, basically the only way I can make sense of it is as a kind of group politics / perceived status threat. E.g., Crary seems very upset that many of the best and most idealistic students on campus are now getting excited about EA and listening to the likes of Will MacAskill rather than just deferring to her & her friends. That doesn't seem like it should be hugely bothersome if one thinks about it in terms of "Are these students going to be doing good things as a result?" But if you attribute less high-minded motivations to the critics, it becomes easier to understand their behavior. Humans are social animals, after all.

Sep 6 · Liked by Richard Y Chappell

I think the main point of conflict is that standard left-wing movements worry that their preferred concerns and solutions won't be very legible to EA--think of something like Communist revolution being analyzed in the classic EA way--and so the worry is the two movements will compete for people with broadly similar worldviews but in a way that draws people away from the left's preferred viewpoint.

Charitably, the worry is analytical: EA isn't equipped to come to the "correct" solution, uncharitably it's just worry over being outcompeted as a movement, and realistically it's a bit of both.

Author

I would be a lot more sympathetic to the charitable interpretation here if those critics would offer some sort of argument/evidence that their preferred views *really are* the "correct" ones (i.e., a first-order argument with which EA could then engage), rather than just *presupposing* this or treating it as self-evident. If the idea is that their view is correct but one can only know this through direct revelation, not reason, then that seems a real problem! On the other hand, if they really do have good reasons then they should try explicitly sharing them, rather than just obliquely complaining that EA is not equipped to recognize their superiority.

Sep 9 · Liked by Richard Y Chappell

It seems to me that one of the things that's going on in lots of leftist/hardcore progressive thought is a shared view that the status quo is really deeply unacceptable--inhumane--to the point where trying facially plausible alternatives that don't have evidence in their favor is a better move than letting things continue as-is.

Think of how one might try to give a defense of the French Revolution in terms a bit more consequentialist than Paine's defense, and with reasonable intellectual humility. You might say: "Look, in the present era there is no robust empirical or theoretical understanding of economics or sociology... but come on, a more egalitarian society has to be a better bet than dehumanizing serfdom!"

I think that's where a lot of people are about "global capitalism."

Sep 10 · Liked by Richard Y Chappell

Oh sure, and I'm more comfortable extending that charity to critics who aren't professional philosophers, whom I think you're more concerned with... I have in mind the average social justice-minded person who otherwise isn't thinking too deeply about any of this.
