13 Comments
Bentham's Bulldog:

I bite the bullet! I once found this really unintuitive, but on reflection the resistance seemed like irrational risk aversion, and the verdict no longer strikes me as unintuitive.

Richard Y Chappell:

Yeah, that may be the best option at the end of the day. But (as flagged in the post) I don't think it's risk aversion. If I try to directly evaluate the normal life (or world) vs. the doubled-up life (or world), the doubled-up one just doesn't strike me as really seeming twice as good.

But it's hard to reconcile this intuition of diminishing marginal value with the competing intuition that arbitrarily large increases in basic goods are never trivial. Damn incoherent intuitions!
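
One way to make the tension precise (a sketch, on the assumption that diminishing marginal value is modeled by a bounded, increasing value function u over quantities n of basic goods, with supremum B):

\[
  u(n+k) - u(n) \;\le\; B - u(n) \;\longrightarrow\; 0 \quad (n \to \infty), \text{ for every fixed increase } k,
\]

so however large a fixed increase in basic goods is, its added value shrinks toward zero at high enough baselines, which is exactly what the never-trivial intuition denies.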

Bentham's Bulldog:

But that's also how the organ-harvesting case seems to deontologists. Ethics often conflicts with our intuitions, and our intuitions thus often need to be revised. This is especially so when an intuition has all the hallmarks of a bad one: it's plausibly explained by status quo bias (isn't it strange that the most valuable lives seem to be the lives we happen to have ahead of us?), it's plausibly explained by irrational risk aversion (we know that the human brain is bad at adding up large numbers -- it fails to shut up and multiply), and it seems impossible to come up with a compelling, justified philosophical position that lets us hold on to it.

One other intuition pump to correct for the status quo bias: suppose that your life was a rerun. You'd already lived, died, and then been reborn. That wouldn't seem to diminish the value of your life. It's only when the extra life seems like an extra, when status quo bias is in full force, that a 1/2 chance of two lives seems less valuable than the certainty of one.

Richard Y Chappell:

Diminishing marginal value of basic goods strikes me as a view with principled appeal. It has some costs, but so does the total view's fanaticism (e.g. sacrificing utopia for a mere 1/2^trillions chance of uber-utopia). I don't think super-high confidence is warranted either way here.
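
To spell out the fanaticism charge (a sketch; V_utopia and V_uber are illustrative placeholders for the values of the two outcomes, and N stands in for the trillions-scale exponent): the total view endorses the gamble whenever

\[
  2^{-N} \cdot V_{\text{uber}} \;>\; V_{\text{utopia}},
  \quad\text{i.e.}\quad
  V_{\text{uber}} \;>\; 2^{N} \cdot V_{\text{utopia}},
\]

and since total value is unbounded on that view (enough additional good lives can always be added), some possible uber-utopia clears that bar for any N, however astronomical.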

Bentham's Bulldog:

I agree that both views have costs; one view's costs just seem far greater than the other's.

[Comment deleted, Aug 13, 2022]
Bentham's Bulldog:

I think this is a case where there is no most rational action to take. There are other cases like this: suppose you could increase the utility in the world by any number you choose. For any number you pick, you could have picked a larger one, so there's no right action, given that every action has a better alternative. The same structure holds here: the expected utility increases indefinitely the more times you flip, but at infinitely many flips the payoff is zero. Thus there is no best action -- yet flipping infinitely many times is clearly wrong, as it guarantees no value.
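
To illustrate the structure (a sketch with assumed payoffs, since the original case is in the deleted comment above): suppose each flip triples the prize with probability 1/2 and destroys it otherwise. Starting from a prize worth V, the expected value after n flips is

\[
  \mathbb{E}[V_n] \;=\; \left(\tfrac{1}{2}\right)^{\!n} 3^{n}\, V \;=\; \left(\tfrac{3}{2}\right)^{\!n} V,
\]

which grows without bound in n, while the probability of surviving infinitely many flips is \lim_{n \to \infty} 2^{-n} = 0, so the limit policy of flipping forever guarantees a prize of zero.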

[Comment deleted, Aug 14, 2022 (edited)]
Bentham's Bulldog:

I'd give precisely the identical preference ordering: 3 > 2 > 1 > 0. The answer here is likewise to play as large a number of rounds as possible. We know humans are terrible at envisioning large numbers. I don't know if I would actually play 100 trillion rounds, but I think I ought to.

[Comment deleted, Aug 14, 2022]
Bentham's Bulldog:

I don't know if I could either, psychologically. But that just seems like an error on our part.

[Comment deleted, Aug 14, 2022]