6 Comments

I'm agnostic about whether you're right in this article, so my higher-order agnosticism about X-risk agnosticism leads me to take X-risks seriously.

Jun 8, 2023 · Liked by Richard Y Chappell

Existential risk is very important in my view. I am really unsure about anthropic arguments like the doomsday argument; they sound very possible, I'm just not sure which one is right. It seems like this weird, very counterintuitive type of thinking might be really important in evaluating x-risk.

Oct 6, 2023 · edited Oct 6, 2023 · Liked by Richard Y Chappell

Out of interest, what are your views on people who say that not only should you be agnostic about the probability of x-risk, but you should also be agnostic about whether human extinction would be a bad thing? Some of my vegan friends (although not *only* the vegans) tend towards the view that, given that factory farming is so bad, stopping it entirely may outweigh the costs of human extinction. When you add in possible future s-risks to human beings (and non-human animals), and the possibility that humans so far have had lives that aren't worth living, is agnosticism *so* irrational here?

author

I think one should give *some* credence to that possibility. But not so much as to undermine the overall positive expectation of future civilization.

Why be relatively optimistic? One reason (iirc, Will mentions this in WWOTF) is that harms tend to be incidental rather than ends-in-themselves for us, so as humanity increases in power we could reasonably expect this to lead to better outcomes. Right now we might be at something like peak harmfulness: powerful enough to dominate other species and the overall environment, but not powerful enough to secure our goals without causing immense incidental harm. With advancing technology, we can more easily mitigate those incidental harms. (Cultured meat being an obvious example.)

Other reasons might depend on disputable value judgments. But I'm inclined to give relatively low weight to hedonic values relative to perfectionist ones: I have some sympathy (short of outright endorsement) for the sort of "Nietzschean" view discussed here: https://rychappell.substack.com/p/the-nietzschean-challenge-to-effective

So I'm inclined to think that flourishing civilization is ultimately more important than incidental harms, *even if* for some reason we're unable to eventually mitigate the latter. I'd still be *very* reluctant to endorse the extinction of intelligent life as the best way forward.

On future s-risks to humans (locked-in dystopia, or some such), I'm all in favour of steps to further reduce those risks *without* abandoning the hope of a positive future. But I don't see much reason to think they're *so* likely as to warrant abandoning hope and positively hoping for the void instead.


(I don't endorse this view; I'm just interested in what you think of it.)

Jun 9, 2023 · Liked by Richard Y Chappell

I think x-risk is like factory farming, in that worrying about it is pretty overdetermined and doesn't depend much on the details of your worldview or your ethical principles. Like, I don't know how I feel about the "astronomical potential" stuff, but that doesn't change my concern about x-risks much at all.
