I have doubts about the asteroid example working against the mugging.
"(1) The asteroid case involved objective chances, rather than made-up subjective credences.
In what real-life situations do we have access to "objective chances"? Never; we don't observe some Platonic realm of chances. We might think that some subjective credences are better grounded than others, but in the real world that's all we have.
The whole concept of EV is kind of subjective: we only observe what happens, not parallel worlds or whatever.
You can take "objective chances" to just be "sufficiently well-grounded subjective chances". E.g., assigning a 50% chance to a coin flip is very different from (and more robustly justified than) assigning some arbitrary 1/Y chance to the wild claims of Pascal's mugger.
Real-life cases that are parallel in structure to the asteroid case come up all the time when thinking about "collective harm"-type cases. Voting in a close election is another example: in a population of X voters, the chance of casting the decisive vote is typically on the order of 1/X, and the outcome affects at least X people. So whenever the total benefits of the better candidate winning are sufficient to justify the time costs of up to X voters, it will generally be worth it for any individual to vote (for the better candidate), no matter how high X is, and hence no matter how small a chance (1/X) their vote has of "making a difference".
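A toy calculation makes the structure explicit (the specific numbers are made up for illustration; nothing in the thread fixes them): the 1/X chance of being decisive and the roughly X beneficiaries cancel, so the expected benefit of voting doesn't shrink as X grows.

```python
# Toy illustration (hypothetical numbers): the 1/X chance of being decisive
# and the ~X beneficiaries cancel out, so the case for voting doesn't
# weaken as the electorate grows.

def expected_value_of_voting(num_voters, benefit_per_person, time_cost):
    p_decisive    = 1 / num_voters                    # ~1/X in a close election
    total_benefit = num_voters * benefit_per_person   # at least X people affected
    return p_decisive * total_benefit - time_cost

for x in (10_000, 1_000_000, 100_000_000):
    print(x, expected_value_of_voting(x, benefit_per_person=50, time_cost=20))
    # the second number is ~30 each time: (1/X) * (X * 50) - 20 = 30, whatever X is
```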
My objection to this type of reasoning is that it seems very motivated, like we're just looking for any principle that lets us reject the mugger (and the wager). But why does this reasoning work?
Imagine P(A) is very uncertain and very low, so we pretend it's zero. But obviously it's not zero in the Bayesian sense, because that would mean we could never become more confident in A regardless of the evidence. So there's a split between the probability we actually believe and the probability we use for expected value calculations. That seems really bad: we're using numbers we know are fake for our EV calculations.
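For what it's worth, the "never more confident regardless of the evidence" point is just Bayes' theorem with a zero prior. A minimal sketch (the likelihoods are generic placeholder numbers, not anything from the thread):

```python
# Minimal sketch: with a prior of literally zero, Bayes' rule can never move
# you off zero, however strong the evidence; a tiny nonzero prior can still grow.

def posterior(prior, likelihood_if_true, likelihood_if_false):
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

strong_evidence = dict(likelihood_if_true=0.99, likelihood_if_false=0.0001)

print(posterior(0.0,   **strong_evidence))   # 0.0: stuck at zero forever
print(posterior(1e-10, **strong_evidence))   # ~1e-6: small, but it can keep growing
```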
It also seems strange to have a binary classification of probabilities as well-grounded or not well-grounded for EV purposes. Finally, what do we do about highly speculative but high probabilities: do we round up to 100%? Imagine someone said, "I'm very confident this claim is true, with P(A) = 99%+, but I haven't thought about it much, so it could change a lot." This implies that the probability of not-A is highly speculative and near zero... what do we do?
Reflective equilibrium! I like Dan G.'s comment on this, from the public fb thread: https://www.facebook.com/richard.chappell/posts/pfbid02GX3CwyAsEhqDtzZbKdaeqz9rjYoq7JZy6uftrQnGMkv1854m7STj5c43Bwq8CR8El?comment_id=1047513809765705
"In the mugging, i think we have a better grip on the facts about rational decision than on priors for how likely it is the mugger can benefit arbitrary large groups of people. But there's nothing wrong with working backwards to figure out sensible priors on the basis of one's firmer grip on facts about the utility of outcomes and facts about the rationality of decisions."
I don't think that anything general can be said about "highly speculative but high probabilities". It depends on the content of the belief, and whether the "speculative" nature of it suggests that the ideal credence is likely to be significantly higher, lower, or what. I'm not suggesting that you can ignore *all* speculative low probabilities. I'm merely suggesting that being speculative and extremely low are *necessary conditions* for such a move. But the final verdict will always depend on the details.
Here's another reason why I think poorly grounded but low probabilities can (at least generally) be ignored.
Imagine I think the probability is 10^-10, but I'm highly uncertain. There should be some lower probability such that I'm fairly certain the real probability is at least that high. Maybe it's 10^-20; then I could use this lower value for EV calculations. This seems much better than rounding to literal zero.
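To make the proposal concrete, here's a small sketch with entirely made-up numbers (the point estimate, lower bound, payoff, and cost are all hypothetical): the confident lower bound still tells you to decline the speculative deal, but without pretending the probability is literally zero.

```python
# Made-up numbers throughout (the thread doesn't fix any of these): compare
# using the best guess, a confident lower bound, and literal zero in the EV.

best_guess_p  = 1e-10   # highly uncertain point estimate
lower_bound_p = 1e-20   # a value I'm fairly sure the real probability exceeds
payoff        = 1e15    # the speculative prize if the claim is true
cost          = 1.0     # what acting on the claim costs

def expected_value(p):
    return p * payoff - cost

print(expected_value(best_guess_p))   #  99999.0 -> the deal looks good
print(expected_value(lower_bound_p))  # -0.99999 -> decline, for a stated reason
print(expected_value(0.0))            # -1.0     -> decline, but by fiat
```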
Suppose the mugger claims that they can realize *any* amount of value that they choose. What probability should you assign to the truth of this claim? I suggest it should be closer to zero than to any other number you can express.
I would probably assign the same value I do to other very implausible religious claims.
Here's an example of rounding down to zero leading to bad EV results. Imagine your eccentric uncle dies. He gave a sealed box to John and to a trillion other people. He told you that he flipped a coin: if it was heads, he put his entire $1 million fortune in John's box; if tails, he put it in someone else's box. You assume that the probability it's in John's box is 0.5 and the probability for each other box is 1 in 2 trillion. If it was tails, you're not sure how he chose which box to put it in. If you could talk to his widow you might gain valuable information: perhaps he put it in a neighbor's box. The probability for each box other than John's is low and very uncertain. You round down the probability for each other box to zero. Since probabilities add up to 1, you're confident that John's box has a million dollars. You offer to buy it from him for $950K. But that's absurd.
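Spelling out the arithmetic (same numbers as the example above): with the honest probabilities, buying John's box for $950K is a clear loss; after rounding every other box down to zero, the leftover probability mass lands on John's box and the purchase looks like a sure gain.

```python
# The uncle's-boxes arithmetic spelled out (same numbers as the example above).

fortune      = 1_000_000      # the $1M that is in exactly one box
offer        = 950_000        # what you offer John for his box
other_boxes  = 10**12         # a trillion other recipients
p_john       = 0.5
p_each_other = 0.5 / other_boxes   # 1 in 2 trillion per box

print(p_john + other_boxes * p_each_other)   # ~1.0: the probabilities add up

# Honest expected value of buying John's box for $950K:
print(p_john * fortune - offer)              # -450000.0 -> a clearly bad deal

# Round each "other box" probability down to zero and the leftover mass
# (which must sum to 1) all lands on John's box:
p_john_rounded = 1.0
print(p_john_rounded * fortune - offer)      # 50000.0 -> looks like free money
```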
Yes, you obviously shouldn't round down in that sort of case. Again, I was identifying *necessary* not *sufficient* conditions for rounding down. In any case where you obviously shouldn't round down, my response will simply be: "I agree! I don't endorse rounding down in that case."
What I'm doing in the OP is instead providing guidance for when you SHOULDN'T round down. It does not follow from this that in all other cases you SHOULD. Rather, other cases are simply left open, so far as the argument of this post is concerned.