Reflective equilibrium! I like Dan G.'s comment on this, from the public fb thread: https://www.facebook.com/richard.chappell/posts/pfbid02GX3CwyAsEhqDtzZbKdaeqz9rjYoq7JZy6uftrQnGMkv1854m7STj5c43Bwq8CR8El?comment_id=1047513809765705
"In the mugging, i think we have a better grip on the facts about rational decision than on priors for how likely it is the mugger can benefit arbitrary large groups of people. But there's nothing wrong with working backwards to figure out sensible priors on the basis of one's firmer grip on facts about the utility of outcomes and facts about the rationality of decisions."
I don't think that anything general can be said about "highly speculative but high probabilities". It depends on the content of the belief, and whether the "speculative" nature of it suggests that the ideal credence is likely to be significantly higher, lower, or what. I'm not suggesting that you can ignore *all* speculative low probabilities. I'm merely suggesting that being speculative and extremely low are *necessary conditions* for such a move. But the final verdict will always depend on the details.
Here's another reason why I think poorly grounded but low probabilities can't (at least generally) be ignored.
Imagine I think the probability is 10^-10, but I'm highly uncertain about that estimate. There should be some lower value that I'm fairly certain the real probability equals or exceeds. Maybe it's 10^-20; then I could use this lower value for EV calculations. That seems much better than rounding to literal zero.
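For concreteness, here's that arithmetic as a quick Python sketch (the payoff figure is invented purely for illustration; only the two probabilities come from my example):

```python
# Toy comparison: a confident lower-bound probability vs. rounding to zero.
# The payoff is a made-up stand-in; the probabilities are from the text above.

payoff = 10**15              # hypothetical value if the speculative claim is true
best_guess = 1e-10           # my uncertain point estimate
lower_bound = 1e-20          # a value I'm fairly certain the real probability exceeds

print(best_guess * payoff)   # 100000.0 -- EV under the point estimate
print(lower_bound * payoff)  # 1e-05    -- tiny, but still nonzero
print(0 * payoff)            # 0        -- rounding to zero discards the stake entirely
```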
Suppose the mugger claims that they can realize *any* amount of value that they choose. What probability should you assign to the truth of this claim? I suggest it should be closer to zero than to any other number you can express.
I would probably assign the same value I do to other very implausible religious claims.
Here's an example where rounding down to zero leads to bad EV results. Imagine your eccentric uncle dies. Before his death, he gave a sealed box to John and to a trillion other people. He told you that he flipped a coin: if it came up heads, he put his entire $1 million fortune in John's box; if tails, he put it in someone else's box. So you assume the probability that it's in John's box is 0.5, and the probability for each other box is 1 in 2 trillion. If it was tails, you're not sure how he chose which box to put it in. If you could talk to his widow, you might gain valuable information; perhaps he put it in a neighbor's box. The probability for each box other than John's is low and very uncertain. Suppose you round the probability for each other box down to zero. Since probabilities must add up to 1, you're now confident that John's box contains the million dollars. You offer to buy it from him for $950K. But that's absurd.
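To make the absurdity explicit, here's the arithmetic as a short Python sketch (all the figures are taken from the example above):

```python
# The uncle's-box arithmetic, with the figures from the example above.

fortune = 1_000_000
n_other_boxes = 10**12            # a trillion other recipients

p_john = 0.5                      # heads: fortune is in John's box
p_other = 0.5 / n_other_boxes     # 1 in 2 trillion per box, and highly uncertain

ev_john_correct = p_john * fortune   # $500,000
# Rounding every other box down to zero forces John's credence to 1,
# since the probabilities must sum to 1:
ev_john_rounded = 1.0 * fortune      # $1,000,000

offer = 950_000
print(ev_john_correct - offer)   # -450000: buying is a large expected loss
print(ev_john_rounded - offer)   #  +50000: but looks like a gain after rounding
```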
Yes, you obviously shouldn't round down in that sort of case. Again, I was identifying *necessary* not *sufficient* conditions for rounding down. In any case where you obviously shouldn't round down, my response will simply be: "I agree! I don't endorse rounding down in that case."
What I'm doing in the OP is instead providing guidance for when you SHOULDN'T round down. It does not follow from this that in all other cases you SHOULD. Rather, other cases are simply left open, so far as the argument of this post is concerned.