
506 RISK

Government agencies often incorporate this attitude into their policies. When the U.S. government regulates toxic substances, it tries first to establish the frequency of effects (poisoning, death, and other health effects) resulting from each chemical. The frequency is equivalent to the probability for the average person. Each estimate has some uncertainty around it, and this is represented as a 95% confidence interval. For example, the best guess might be “ten cases per year” but the interval might be “a 95% chance that the number of cases will fall between five and twenty.” Government agencies typically use the upper bound of the 95% confidence interval as their estimate for policy purposes (Zeckhauser and Viscusi, 1990). In some cases, this policy leads to expensive cleanups that would not be done if the decision were based on the expected benefit (Viscusi, Hamilton, and Dockins, 1997).
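As a minimal sketch of the difference between a best guess and an upper-bound policy estimate: the interval in the text (five to twenty) is asymmetric, as skewed risk distributions often are, but a symmetric normal approximation makes the idea concrete. The numbers below (a standard error of 3) are hypothetical, chosen only for illustration.

```python
# Hypothetical illustration: best guess vs. upper-bound policy estimate.
# Assumes, for simplicity, a symmetric normal approximation around the
# best guess; real regulatory intervals are often skewed.
best_guess = 10.0   # estimated cases per year
std_error = 3.0     # uncertainty of the estimate (hypothetical)

z = 1.96            # two-sided 95% normal quantile
lower = best_guess - z * std_error
upper = best_guess + z * std_error

# An agency following the upper-bound policy regulates as if the risk
# were `upper`, not `best_guess`.
policy_estimate = upper
print(f"95% CI: ({lower:.2f}, {upper:.2f}); policy estimate: {policy_estimate:.2f}")
```

The gap between `policy_estimate` and `best_guess` grows with the uncertainty, so the more ambiguous the risk, the more heavily the upper-bound policy weights it.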

Such responses to ambiguity can result in too many resources being spent on reducing ambiguous risks, when the same resources could do more good elsewhere. Suppose we compare two risks, each causing on the order of ten expected fatalities per year. For risk A (like auto accidents), we have had extensive experience, and we know that the figure of ten is accurate. For risk B, the best guess is eight, but the 95% confidence interval ranges as high as sixteen. Suppose it is equally costly to reduce each risk by 50%, whatever its level, and we must allocate funds between the two. If we go by the upper bound, we will allocate all the funds to B, since sixteen exceeds ten. If we go by the best guess, we will allocate all the funds to A, since ten exceeds eight. By allocating funds according to the upper bound, then, we will — on the average, across many decisions of this type — do a little less well than we could: we will cut the total fatality rate by four (half of eight) instead of five (half of ten).
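The arithmetic in this example can be sketched directly. The sketch below just restates the text's numbers: funds suffice to halve exactly one risk, and expected lives saved are computed from the best guess (assumed unbiased).

```python
# The two risks from the text: A is well characterized, B is ambiguous.
risk_a = {"best_guess": 10.0, "upper_95": 10.0}   # e.g., auto accidents
risk_b = {"best_guess": 8.0,  "upper_95": 16.0}   # ambiguous risk

def expected_lives_saved(risk, reduction=0.5):
    # Expected benefit of halving a risk, based on the (unbiased) best guess.
    return reduction * risk["best_guess"]

# Upper-bound rule: B looks worse (16 > 10), so all funds go to B.
saved_by_upper_bound = expected_lives_saved(risk_b)   # 0.5 * 8 = 4.0
# Best-guess rule: A looks worse (10 > 8), so all funds go to A.
saved_by_best_guess = expected_lives_saved(risk_a)    # 0.5 * 10 = 5.0

print(saved_by_upper_bound, saved_by_best_guess)
```

Averaged over many such decisions, the upper-bound rule saves four expected lives where the best-guess rule would save five — the cost of the safety factor.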

This argument depends on one crucial assumption: that the “best guess” is unbiased for unknown risks. If the best guess for unknown risks is systematically too low, then the strategy of using a safety factor is normatively correct. (Of course, it could be that the bias goes the other way.) To my knowledge, this assumption has not been tested.

Catastrophic versus individual<br />

Slovic et al. (1984) found that people are more frightened of risks with potentially catastrophic outcomes, such as nuclear power-plant meltdowns, than of risks that are expected to cause greater harm at a predictable rate, such as air pollution from the burning of fossil fuels. This was also found in the psychometric research described earlier. Public reactions seem to follow the same principle. Every day in the United States, a dozen children are killed by guns, one at a time. The public and its representatives were galvanized into action by a couple of well-publicized incidents in which several school children were shot at once. This was, as several observers noted, a “statistical blip” in the overall problem. But the fact that the deaths happened all at once made people more concerned. Can our concern with catastrophic risks be justified?

One possible justification is that the utility of a life lost increases with the number of lives already lost. A disaster that kills 10,000 people could be 100,000 times worse than an accident that kills ten. But, to a first approximation, the people who matter
