Thinking and Deciding

DESCRIPTIVE THEORY OF CHOICE UNDER UNCERTAINTY

following scenario (Tversky and Shafir, 1992a): “Imagine that you have just taken a tough qualifying examination. It is the end of the fall quarter, you feel tired and rundown, and you are not sure that you passed the exam. In case you failed you have to take the exam again in a couple of months — after the Christmas holidays. You now have an opportunity to buy a very attractive five-day Christmas vacation package to Hawaii at an exceptionally low price. The special offer expires tomorrow, while the exam grade will not be available until the following day.” Subjects were asked whether they would buy the package, not buy it, or “pay a $5 nonrefundable fee in order to retain the rights to buy the vacation package at the same exceptional price the day after tomorrow — after you find out whether or not you passed the exam.” Sixty-one percent chose to pay the $5. Only 32% would buy the package. When asked what they would do if they knew that they had passed or knew that they had failed, however, most subjects would buy the package in each condition, and only 31% would pay $5 to delay the decision for two days. It seems that people would take the vacation to celebrate if they passed, and to gather their strength if they failed, but, if they did not know their reasons, they preferred not to decide until they did.

A similar tendency to defer decisions or to “do nothing” results from conflict, that is, from having reasons to choose or reject more than one option (Tversky and Shafir, 1992b). For example, in one experiment, subjects had filled out a questionnaire and expected to be paid $1.50. Half of the subjects were offered a metal pen worth about $2 instead of their payment, and only 25% of these subjects took the $1.50, the rest taking the pen. The other half of the subjects were offered a choice of the same pen or two plastic pens. Now 53% of these subjects took the money instead of either of the other options. As in the vacation study just described, people need clear reasons to abandon the default option: delaying the vacation decision or taking the money.

The heuristic of not acting without reasons is, of course, generally a good one. But if you would take the same action in all possible states of the world — despite the reasons being different in different states — then you might as well decide to take it. You do have reasons, although your reasons may not be known yet.
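To make this argument concrete (a minimal sketch in expected-utility notation; the symbols below are ours, not Tversky and Shafir's), let $E$ be the event that you passed the exam, let $p$ be its probability, and let $u(a, E)$ be the utility of act $a$ when $E$ obtains. Then

\[
EU(\text{buy}) - EU(\text{not buy})
= p\,\bigl[u(\text{buy}, E) - u(\text{not buy}, E)\bigr]
+ (1-p)\,\bigl[u(\text{buy}, \neg E) - u(\text{not buy}, \neg E)\bigr] .
\]

If buying is the better act both when you pass and when you fail, both bracketed differences are positive, so the whole expression is positive for every value of $p$. Learning $E$ before deciding therefore cannot change the choice, and so cannot be worth a $5 fee; this is essentially Savage's sure-thing principle.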

Conclusion

We have seen in this chapter that several factors — not all fully understood — lead us to violate expected-utility theory in its simple form. Some apparent violations, such as those caused by regret or disappointment, are not necessarily violations at all. In these cases, an overly simple analysis could have neglected real emotional consequences of decisions. It is also possible, however, that we sometimes fail to consider the option of trying to control our emotional responses in order to achieve our remaining goals.

Other violations of the theory, such as ambiguity effects, might result from our using generally useful heuristics in situations in which they are harmful rather than helpful in achieving our goals. Such biases can result from “heuristics based on
