Why You Are Either the Solution—or the Problem
You are on a glacier with two climbers. The first slips and falls into a crevasse. He might survive if you call for help, but you don’t, and he perishes. The second climber you actively push into the crevasse, and he dies shortly afterward. Which weighs more heavily on your conscience?
Considering the options rationally, it’s obvious that both are equally reprehensible, resulting as they do in death for your companions. And yet something makes us rate the first option, the passive option, as less horrible. This feeling is called the omission bias. It crops up where both action and inaction lead to cruel consequences. In such cases, we tend to prefer inaction; its results seem more anodyne.
Suppose you are the head of the Food and Drug Administration. You must decide whether or not to approve a drug for the terminally ill. The pills can have fatal side effects: They kill 20 percent of patients on the spot, but save the lives of the other 80 percent within a short period of time. What do you decide?
Most would withhold approval. To them, waving through a drug that takes out every fifth person is a worse act than failing to save the 80 percent who would have been cured. It is an absurd decision, and a perfect example of the omission bias. Suppose that you are aware of the bias and decide to approve the drug in the name of reason and decency. Bravo. But what happens when the first patient dies? A media storm ensues, and soon you find yourself out of a job. As a civil servant or politician, you would do well to take the ubiquitous omission bias seriously—and even foster it.
Case law shows how ingrained such “moral distortion” is in our society. Active euthanasia, even if it is the explicit wish of the dying, is punishable by law, whereas deliberate refusal of lifesaving measures is legal (for example, following so-called DNR orders—do not resuscitate).
Such thinking also explains why parents feel it is perfectly acceptable not to vaccinate their children, even though vaccination discernibly reduces the risk of catching the disease. Of course, there is also a very small risk of getting sick from the vaccine. Overall, however, vaccination makes sense. It protects not only the children, but society, too: A person who is immune to the disease will not infect others. Objectively, if non-vaccinated children ever contracted one of these diseases, we could accuse the parents of harming them through their inaction. But this is exactly the point: Deliberate inaction somehow seems less grave than a comparable action—say, if the parents intentionally infected them.
The omission bias lies behind the following delusions: We wait until people shoot themselves in the foot rather than take aim ourselves. Investors and business journalists are more lenient on companies that develop no new products than they are on those that produce bad ones, even though both roads lead to ruin. Sitting passively on a bunch of miserable shares feels better than actively buying bad ones. Building no emission filter into a coal plant feels superior to removing one for cost reasons. Failing to insulate your house is more acceptable than burning the spared fuel for your own amusement. Neglecting to declare income is less immoral than faking tax documents, even though the state loses out either way.
In the previous chapter, we met the action bias. Is it the opposite of the omission bias? Not quite. The action bias causes us to offset a lack of clarity with futile hyperactivity and comes into play when a situation is fuzzy, muddy, or contradictory. The omission bias, on the other hand, usually abounds where the situation is intelligible: A future misfortune might be averted with direct action, but this insight doesn’t motivate us as much as it should.
The omission bias is very difficult to detect—after all, action is more noticeable than inaction. In the 1960s, student movements coined a punchy slogan to condemn it: “If you’re not part of the solution, you’re part of the problem.”