Yes, pretty similar! One way to put pressure on maximizing is to consider lower-stakes cases. Consider the smallest possible rational error. Maybe you pass by a dime, initially thinking it isn't worth the bother of picking up, and then later decide that it probably would have been (slightly) worth the (tiny) bother after all. It sounds a bit pathological to insist that you *violated an obligation* here; it's just not that big of a deal.
Insofar as impermissibility is supposed to be a "big deal", and sheer sub-optimality isn't necessarily a big deal, I think that if I weren't a sentimentalist about permissibility I would sooner just stick to scalar consequentialism (and adopt an error theory about permissibility as a concept) than posit maximizing "demands" that go above and beyond simply identifying what we have *most reason* to do.
Thx. I wonder if the rationalist has an advantage when it comes to suboptimal altruism? Imagine someone voluntarily choosing to make a suboptimal sacrifice for someone else (in other words, they sacrifice slightly more than the other person gains). Intuitively it seems like, within certain limits, this would be fully justified. I don't think they made any "all-things-considered error". The rationalist can explain this fact by saying that this person did what they had most reason to do given their own subjective weightings of their moral vs. prudential reasons (assuming these weightings qualify as "acceptable").
Oh, that's funny, I would have thought the very opposite! To me it seems completely obvious that disproportionate self-sacrifice is unwarranted (or contrary to reason), but we may permit it on grounds that others don't have standing to criticize you for it, or something like that. So I think the sentimentalist offers the better story here. I guess that suggests it's a nice test case for seeing which way one's own intuitions go!
I think of myself as pretty straight Norcross scalar, but whether I'd be an "error theorist" is really a psychological/linguistic claim about the people around me, isn't it? Except for those who actually believe in divine punishment/reward, karma, etc, it mostly seems like I'd be a noncognitivist about obligation/permissibility; these sentences seem a lot like non-propositional orders or allowances.