If you *know* what the optimal option is, it would seem irrational to deliberately donate elsewhere (even to the 2nd or 3rd best options). So there's an important sense in which you should always prefer to do more good rather than less (all else equal), and not be satisfied with just any old "good enough" option.
But if you don't know what the best option is (as in real life, we never do), I don't think you should obsess over this or feel "paralyzed by guilt". If you make a good-faith effort to optimize, and do so in a reasonable fashion (not, like, making reckless criminal gambles or anything), then I don't think there is any cause for guilt. Blameworthiness has more to do with quality of will, or making a decent effort, than actual outcomes (which are outside of our control). I discuss this more in "Uncertain Optimizing and Opportunity Costs":
https://www.goodthoughts.blog/p/uncertain-optimizing-and-opportunity
A separate issue (for when all else is *not* equal) is how much we should be willing to personally sacrifice in order to do more impartial good for others. I'm a satisficer about that question:
https://philpapers.org/rec/CHASBE-4
Thanks for the thoughtful reply, Richard. I'll take a gander at the links you've posted. And yeah, I agree that you'd want to be an absolute optimizer if you were omniscient - I was mostly thinking that satisficing is a pretty sound practical strategy when you're not.