
PROSPECT THEORY 265

asserts that one’s choices ought to depend on the situation itself, not on the way it is described. In other words, when we can recognize two descriptions of a situation as equivalent, we ought to make the same choices for both descriptions. Subjects seem to violate this principle.³ The invariance principle would seem to be a principle of rational choice that is at least as fundamental as other principles we have assumed as part of utility theory, such as transitivity and the sure-thing principle. Violations of the invariance principle are also called framing effects, because the choice made is dependent on how the situation is presented, or “framed.”

Note that π(p), unlike p itself, is not additive. In general, π(p) + π(1 − p) < 1. Therefore, we cannot assume that the 1.00 probability of $1,000 in the Allais paradox can be psychologically decomposed (as in Table 11.2) into .01 + .10 + .89. At least part of the bias shown in the Allais paradox is caused by the certainty effect operating on Option 1.⁴
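The subadditivity of π can be made concrete with a small numerical sketch. Prospect theory constrains π only qualitatively, so the specific weighting function below is an assumption for illustration (the one-parameter form later proposed by Tversky and Kahneman in 1992, with γ = 0.61); any function with the same qualitative shape would show the same pattern:

```python
def pi(p, gamma=0.61):
    """Illustrative probability-weighting function.

    This is the Tversky-Kahneman (1992) one-parameter form, used here
    only as a stand-in: the pi of prospect theory is characterized
    qualitatively, not by a formula.
    """
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Subadditivity: pi(p) + pi(1 - p) < 1 for p strictly between 0 and 1
print(pi(0.5) + pi(0.5))      # roughly 0.84, not 1

# Certainty effect: removing .01 of probability from a sure thing
# costs far more decision weight than the same change elsewhere
print(pi(1.0) - pi(0.99))     # roughly 0.09
print(pi(0.50) - pi(0.49))    # roughly 0.005
```

Because the weights of complementary events do not sum to 1, the sure $1,000 in the Allais paradox cannot be treated as a weighted sum over the probabilities .01, .10, and .89.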

Is the certainty effect rational? Why should we not weigh certain (sure) outcomes more than uncertain ones? One reason why not is that it leads us to more inconsistent decisions, decisions that differ as a function of the way things are described to us (or the way we describe things to ourselves). Second, our feeling of “certainty” about an outcome is often, if not always, an illusion, or, to put it more precisely, another sort of artifact of the way things are described (as we noted in Chapter 16). For example, you may think of ($30) as a certain outcome: You get $30. Unless having money is your only goal in life, though, the $30 is really just a means to other ends. You might spend it on tickets to a football game, for example, and the game might be close, and so exciting that you tell your grandchildren about it — or it might be a terrible game, with the rain pouring down, and you, without an umbrella, having to watch your team get slaughtered. You might use the money to buy a book that enlightens you more than a year of college — or a book that turns out to be a lot of trash. In short, most, if not all, “certain” outcomes can be analyzed further, and in doing so one finds, on close examination, that the outcomes are themselves gambles. The description of an outcome as certain is not certainty itself.

An important consequence of the certainty effect (McCord and de Neufville, 1985) is the conclusion that people do not conform to the assumptions underlying the method of gambles when this method is used to measure utility. When people say that they are “indifferent” between ($5) and ($20, .5), we cannot assume that their utility for $5 is literally halfway between that of $0 and that of $20. They underweigh

³ The different patterns of choices in the Allais paradox, depending on whether the situation is presented in a table or not, are another example of violation of the invariance principle.

⁴ Quiggin (1982), Segal (1984), and Yaari (1985) have shown that the major results ascribed to the π function, including the certainty effect, can be accounted for by other transformations of p. These transformations avoid the following problem: According to prospect theory taken literally, a person might prefer ($5.01, .05; $5.02, .05) to ($5.03, .10), even though we might think of ($5.03, .10) as ($5.03, .05; $5.03, .05), which is clearly better than the first option. This preference is possible if π(.05) is sufficiently large compared to π(.10)/2. It is called a violation of stochastic dominance. The way to avoid violations of stochastic dominance is to rank the outcomes in order of preference and calculate, for each outcome, the probability Q of doing at least as well as that outcome. These Qs can be transformed freely, as long as Q is 1 for the worst outcome. Reviews of other recent developments of this sort are found in Fishburn (1986), Machina (1987), Sugden (1986), and Weber and Camerer (1987).
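The dominance violation and the rank-dependent repair described in footnote 4 can be checked numerically. The specific weighting function (a Tversky-Kahneman 1992 form) and the use of dollar amounts as stand-ins for values are assumptions made only for illustration:

```python
def pi(p, gamma=0.61):
    # Illustrative weighting function, assumed here only because it makes
    # pi(.05) large relative to pi(.10)/2, as the footnote requires
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Prospect theory taken literally: weight each probability separately.
# (Dollar amounts stand in for subjective values, for simplicity.)
split_direct = pi(0.05) * 5.01 + pi(0.05) * 5.02   # ($5.01, .05; $5.02, .05)
combined_direct = pi(0.10) * 5.03                  # ($5.03, .10)
assert split_direct > combined_direct  # the dominated prospect wins: a violation

# Rank-dependent repair: transform Q = P(doing at least as well as that
# outcome) and use successive differences of transformed Qs as weights.
split_rd = pi(0.05) * 5.02 + (pi(0.10) - pi(0.05)) * 5.01
combined_rd = pi(0.10) * 5.03
assert combined_rd >= split_rd  # dominance is respected
```

Under the rank-dependent scheme the weights for the split prospect sum to exactly π(.10), so splitting an outcome can never increase its total decision weight, which is what rules out the violation.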
