
Patient Safety Concepts

The Monday Morning Quarterback

Hindsight Bias in Medical Decision Making

By Karen Appold

Have you ever heard someone say, “I knew that was going to happen”? Everyone knows that predicting the future becomes much easier once the future is in the past. Sayings like “hindsight is 20/20” and “Monday morning quarterback” are popular ways to describe such predictions. These expressions refer to what psychologists call hindsight bias, the tendency for people to regard past events as expected or obvious, even when in real time the events perplexed those involved.

Safety analysis experts are now applying the hindsight bias model to medical decision making. Here hindsight bias describes the tendency to judge the events leading up to an accident as errors because the bad outcome is known (1).

Take, for example, the way in which clinicians interpret a lab result. The clinician has a certain amount of information that allows him to assess the probability of a diagnosis or the likelihood that a condition will worsen. He then makes decisions based on those probabilities. When authorities, such as a consulting clinician or an expert witness, assess whether the treating physician made a good decision and the outcome is unknown, the experts will often support the decision. But if the experts learn that the patient had a negative outcome, they are more likely to think that the decision was poor. In other words, knowing the outcome in the present affects the experts’ analysis of events in the past.

This thinking process is a universal human phenomenon. “Cognitive psychologists have realized that hindsight bias is almost unavoidable. When a negative outcome occurs, the tendency is to think that if a different action had been taken, the outcome could have been prevented,” explained Rodney A. Hayward, MD, director of the Robert Wood Johnson Foundation Clinical Scholars Program and professor of medicine and public health at the University of Michigan in Ann Arbor. “Ultimately, a decision is made with the intent to reduce the risk of something bad happening. But any action bears certain costs and risks.”

Hayward recalls a particular scenario involving a lab test result that shows how easily hindsight bias can happen. A patient’s creatinine level increased from 1.3 to 1.6 units over 3 months. The patient appeared to be fairly stable, so the clinicians weren’t too concerned and scheduled a follow-up visit in 3 months. Three weeks later, however, the patient was hospitalized because his creatinine level spiked. Upon reflection, the clinicians questioned whether they should have re-checked the patient sooner.

“But the reality is that the difference between those two creatinine levels is not that unusual and is almost within the range of lab error,” Hayward said. “In addition, it is unclear if that slight bump was a warning sign. If we re-checked everyone with such a slight increase, we would constantly inconvenience patients, and it would be costly.”
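Hayward’s point about lab error can be made concrete. One standard check, not mentioned in the article but sketched here purely as an illustration, is the reference change value (RCV), which asks whether two serial results differ by more than combined analytical and within-person biological variation. The coefficients of variation below are assumed, plausible values, not figures from the article; a laboratory would substitute its own analytical CV.

```python
import math

def reference_change_value(cv_analytical, cv_biological, z=1.96):
    """Two-sided 95% RCV, expressed as a fraction of the first result."""
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_biological**2)

cv_a = 0.05                  # assumed analytical CV for creatinine (5%)
cv_i = 0.06                  # assumed within-person biological CV (6%)

first, second = 1.3, 1.6     # the serial results from the scenario
observed = (second - first) / first
rcv = reference_change_value(cv_a, cv_i)

print(f"observed change: {observed:.1%}")   # ~23.1%
print(f"95% RCV:         {rcv:.1%}")        # ~22% with these assumed CVs
```

With these assumed CVs, the roughly 23% rise sits right at the edge of significance, consistent with Hayward’s point that the bump was almost within the range of expected variation rather than a clear warning sign.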

This is a good example of hindsight bias in medical decision making: when something negative happened (the patient was hospitalized), the original decision (not to re-check him earlier) was questioned. But if the decision had been evaluated in real time, most likely everyone would have agreed that a routine 3-month follow-up appointment was appropriate and that the slightly higher creatinine level was not of concern.

Monday morning quarterbacking in medical decision making can be minimized if decisions are evaluated based on the information available at the time of the decision, and not after the outcome is known.

Examples of Hindsight Bias
• “I knew they were going to lose.”
• “That’s exactly what I thought was going to happen.”
• “I saw this coming.”
• “That’s just common sense.”
• “I had a feeling you might say that.”

To minimize hindsight bias in this instance, clinicians should question whether a patient with such a small change in creatinine level over a 3-month period should be re-checked weekly. In general, a good technique for minimizing hindsight bias is to formally consider alternative explanations that do not involve errors.

In fact, making too many changes based on small random errors in lab tests can actually have negative outcomes. “Some research suggests that if you check an international normalized ratio (INR) to monitor warfarin therapy too frequently and make constant adjustments, you will actually do less good than if you check it at slightly longer intervals and make smaller adjustments. This is because you may overcorrect for random variations in INR by getting too much information from lab tests,” Hayward explained.
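The overcorrection Hayward describes can be illustrated with a toy simulation. This is a sketch under strong assumptions, not a clinical model: the patient is assumed stable (no true drift in warfarin response), the noise levels are invented, and “checking more often and adjusting more” is collapsed into a single adjustment gain. What it demonstrates is that a rule that fully corrects every measured deviation feeds assay noise back into the dose and widens the spread of the true INR.

```python
import random

def simulate(gain, periods=5000, seed=42):
    """Toy model: true INR = current setpoint + biological noise;
    the measured INR adds assay noise; each period the dose is
    adjusted so the setpoint moves by -gain * (measured - target).
    gain=0 means never adjust; gain=1 means fully "correct" every
    measured deviation. All noise levels are assumptions."""
    rng = random.Random(seed)
    target = 2.5
    setpoint = target            # where therapy currently aims the INR
    sq_err = 0.0
    for _ in range(periods):
        true_inr = setpoint + rng.gauss(0, 0.15)   # assumed biological variation
        measured = true_inr + rng.gauss(0, 0.20)   # assumed assay/random noise
        sq_err += (true_inr - target) ** 2
        setpoint -= gain * (measured - target)     # dose-adjustment rule
    return (sq_err / periods) ** 0.5

for gain in (0.0, 0.25, 0.5, 1.0):
    print(f"adjustment gain {gain:.2f}: "
          f"RMS deviation of true INR from target = {simulate(gain):.3f}")
```

With these assumed noise levels, fully reacting to every reading roughly doubles the spread of the true INR around target compared with leaving the stable patient alone, mirroring the research Hayward cites: gentler, less frequent adjustments can do more good.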

Ultimately, hindsight bias can never be completely avoided. “When we criticize certain decisions and policies, we need to consider the cost, as well as the risk and consequences of alternative actions.”

One problem with hindsight bias is that it leads us to criticize or even punish healthcare workers who make decisions that lead to poor patient outcomes. To avoid these negative consequences, Hayward advises against adopting new policies or punishing employees without reflecting on the full process. It is better to evaluate whether the decision was reasonable and to determine the best action to take in the future.

REFERENCE
1. Hindsight bias. Available at: http://psnet.ahrq.gov/glossary.aspx#hindsightbias. Accessed August 13, 2010.

Karen Appold is an editorial consultant for the clinical laboratory industry. Email: karenappold@comcast.net

Patient Safety Focus
Editorial Board

Chair
Michael Astion, MD, PhD
Department of Laboratory Medicine
University of Washington, Seattle

Members
Peggy A. Ahlin, BS, MT(ASCP)
ARUP Laboratories
Salt Lake City, Utah

James S. Hernandez, MD, MS
Mayo Clinic Arizona
Scottsdale and Phoenix

Devery Howerton, PhD
Centers for Disease Control and Prevention
Atlanta, Ga.

Clinical Laboratory News, October 2010
