There’s a general recipe that underlies the organ-harvesting case and similar standard “counterexamples” to utilitarianism:
(1) Imagine an action that violates important (utility-promoting) laws or norms, and—in real-world circumstances—would be disastrous in expectation.
(2) Add some fine print stipulating that, contrary to all expectation, the act is somehow guaranteed to turn out for the best.
(3) Note that our moral intuitions rebel against the usually-disastrous act. Checkmate, utilitarians!
This is a bad way to do moral philosophy. Our intuitions about real-world situations may draw upon implicit knowledge about what those situations are like, and you can’t necessarily expect our intuitions to update sufficiently based on fine print stipulating that actually it’s an alien situation nothing like what it seems. If you want a neat, tidy, philosophical thought experiment where all else is held equal—unlike in real life—then it’s best to avoid using real-life settings. Rather than asking us to assess an alien situation masquerading as a real-life one, it would be much clearer to just honestly describe the alien situation as such. Literally.
Consider Martian Harvest:
Martians, it turns out, are rational, intelligent gelatinous blobs whose sensory organs are incapable of detecting humans. They generally live happy and peaceful lives, except for practicing ritual sacrifice. Upon occasion, a roving mob will impound six of their fellow citizens and treat them as follows. Five have a portion of their “vital gel” removed and ritually burned. As a result, they die within a day, unless the sixth chooses to sacrifice himself—redistributing the entirety of his vital gel to the other five. Whoever is alive at the end of the day is released back into Martian society, and lives happily ever after.
You find yourself within the impounding facility (with a note on your smartwatch indicating that you will be automatically teleported back to Earth within an hour). Five devitalized Martians are wallowing in medical pods, while the sixth is studiously ignoring the ‘sacrifice’ button. You could press the button, and no-one would ever be the wiser. If you do, the Martian medical machinery will (quickly and painlessly) kill the sixth, and use his vital gel to save the other five.
Should you press the button?
This is a much more philosophically respectable and honest thought experiment. Believers in deontic constraints will naturally conclude that you should not press the button, as that would kill one unconsenting (Martian) person as a means to saving five others. Still, in this situation where it’s clear that all else truly is equal, I find it intuitively obvious that one ought to press the button, and expect that many others will agree. It’s not any sort of costly “bullet” to bite.
That is to say, Martian Harvest is not a “counterexample” to utilitarianism, but simply a useful test case for diagnosing whether you’re intuitively drawn to the utilitarian account of what fundamentally matters.
Now, given that Martian Harvest more transparently describes the structural situation that the familiar Transplant scenario aspires to model, and yet our intuitions rebel far more against killing in the Transplant case, it seems safe to conclude that extraneous elements of the real-world setting are distorting our intuitions about the latter. (And understandably enough: as even utilitarians will insist, real-world doctors really shouldn’t murder their patients! This is not a verdict that differentiates utilitarians from non-utilitarians, except in the fevered imaginations of their most rabid critics.)
That is, Transplant is, philosophically speaking, an objectively worse test case for differentiating moral theories. It’s an alien case masquerading as a real-life one, which builds in gratuitous confounds and causes completely unnecessary confusion. That’s bad philosophy, and anyone who appeals to Transplant as a “counterexample” to utilitarianism is making a straightforward philosophical mistake. Encourage them to consider a transparently alien case like Martian Harvest instead.
As written, the Martian case has a wrinkle that may sneak in some extra moral intuitions: the sixth Martian is avoiding the button, choosing to allow five of its fellows to die so that it might live. It is therefore morally culpable (if to an understandable degree!), which makes "punishing" it feel less bad, perhaps especially to a certain kind of traditional deontologist; there are also (as in the transplant case) game-theoretic considerations that a utilitarian might weigh. In the other direction, there may be some hesitancy about interfering with a (literally) alien cultural practice whose purposes you don't understand.
Of course, if you specify that the sixth Martian has not yet awoken, the punishment intuition presumably disappears; and if you specify that the ritual sacrifice came about for some bad reason (a cruel Martian emperor decreed it for its amusement, or whatever), the hesitancy about the cultural practice might disappear too. So this is all a quibble that doesn't detract from your main point.
Richard,
For most of my life I've been a (partially conflicted) believer in natural rights, but lately I've been pushed more and more toward utilitarianism and am now teetering on the brink -- and this post of yours contributed greatly to bringing me to this point.
However, there are a couple of topics on which I am having trouble embracing the utilitarian viewpoint, and I would love to see you write a post about them:
1. Just Deserts: (This example is from Huemer.) You have a tasty cookie that will produce harmless pleasure with no other effects. You can give it to either the serial killer Ted Bundy or the saintly Mother Teresa. Bundy enjoys cookies slightly more than Teresa. Should you therefore give it to Bundy?
I suppose utilitarians might say that you could give the cookie to Teresa to avoid incentivizing serial killing, or because other people might see you give the cookie to Bundy and derive dissatisfaction from their sense of justice being violated (even if their conception of justice is incorrect). But these responses would dodge the point -- most people have the intuition that giving the cookie to Ted Bundy is fundamentally wrong, beyond any downstream consequences, simply because Ted Bundy doesn't *deserve* the cookie.
I've heard of "desert-adjusted" utilitarianism (DAU) (https://utilitarianism.net/near-utilitarian-alternatives/#desert-adjusted-views), which seems to address the issue head-on. Do you think DAU is the correct framework?
2. Restitution: Consider Abe, Bob, and Cindy. Abe owns a bike. However, Bob would get more utility from the bike than Abe. Bob steals the bike from Abe (with no intention of using the bike to aid in committing more crimes). Cindy is wealthy and could buy Abe a new bike with minimal utility loss to herself.
Putting aside the important deterrent effects of having laws against stealing and the fact that stealing is usually wrong, utilitarianism would seem to call for letting Bob keep the stolen bike and having Cindy buy Abe a new one. Yet this strikes most people as unfair -- Bob stole the bike, so he should be required to return it to Abe (or buy him a new one that is just as good).
While I can appreciate that in other alleged counterexamples, such as the organ-harvesting case, we cannot so easily set aside our intuitions about the broader implications, or our status-quo and other biases, I'm not confident that response would be satisfactory in this example. Or is it?
Would you say that property rights are just a social construct and so Abe in fact had no greater moral claim to the bike than Bob did? Would our intuitions or the morally justified resolution change if we stipulated that Bob first asked Abe politely for the bike and Abe refused and only then did Bob take it?
^ I would be eager to read a post of yours addressing these two topics!
Lastly, I would also be interested in reading your thoughts on a utilitarian legal framework -- is private ownership of the means of production justified, and if so, to what extent? What should the law require with respect to redistributive justice? I am aware that many utilitarians embrace common sense moral norms in many cases (https://utilitarianism.net/utilitarianism-and-practical-ethics/#respecting-commonsense-moral-norms), but I need more detail!
Thank you.