The Accountant, May-June 2017

governance

STRATEGIC DECISION MAKING

A group member must act as the devil’s advocate

By Ruth Gatwiri, ruthkgatwiri@gmail.com

Even the best-designed strategic planning systems will fail to produce the desired results if managers do not effectively use the information at their disposal. Consequently, it is important that strategic managers learn to make better use of the information they have, and understand why they sometimes make poor decisions. One important way in which managers can make better use of their knowledge and information is to understand how common cognitive biases can result in poor decision making. The rationality of decision making is bound by one’s cognitive capabilities. Humans are not supercomputers, and it is difficult for us to absorb and process large amounts of information effectively. As a result, when we make decisions, we tend to fall back on certain rules of thumb, or heuristics, that help us to make sense out of a complex and uncertain world. However, sometimes these rules lead to severe and systematic errors in the decision-making process. Systematic errors are those that appear time and time again.

They seem to arise from a series of cognitive biases in the way that humans process information and reach decisions. Because of cognitive biases, many managers may make poor strategic decisions. Numerous cognitive biases have been verified repeatedly in laboratory settings, so we can be reasonably sure that these biases exist and that all people are prone to them.

The prior hypothesis bias refers to the fact that decision makers who have strong prior beliefs about the relationship between two variables tend to make decisions on the basis of these beliefs, even when presented with evidence that their beliefs are incorrect. Moreover, they tend to seek and use information that is consistent with their prior beliefs while ignoring information that contradicts these beliefs. To place this bias in a strategic context, it suggests that a CEO who has a strong prior belief that a certain strategy makes sense might continue to pursue that strategy despite evidence that it is inappropriate or failing.

Another well-known cognitive bias, escalating commitment, occurs when decision makers, having already committed significant resources to a project, commit even more, even when they receive feedback that the project is failing. This may be an irrational response; a more logical response would be to abandon the project and move on (that is, to cut your losses and exit), rather than escalate commitment. Feelings of personal responsibility for a project seemingly induce decision makers to stick with a project despite evidence that it is failing.
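
To see why escalating commitment is usually irrational, a simple expected-value sketch can help. The figures below are entirely hypothetical illustrations (the sunk cost, the additional investment, the recovery probability and the payoff are invented, not taken from the article); the point is only that a rational go/no-go decision compares future costs against future payoffs and ignores what has already been spent.

```python
# Hypothetical illustration of the sunk-cost logic behind escalating commitment.
# All figures are invented for the example; only future costs and payoffs matter.

sunk_cost = 5_000_000          # already spent -- irrelevant to the go/no-go decision
extra_investment = 2_000_000   # what "escalating commitment" would add now
p_success = 0.20               # assumed probability the struggling project recovers
payoff_if_success = 6_000_000  # assumed future payoff if it does recover

# Expected value of continuing, counting only future cash flows:
ev_continue = p_success * payoff_if_success - extra_investment
# Expected value of abandoning (cutting losses): no further outlay, no further payoff.
ev_abandon = 0

print(f"EV of continuing: {ev_continue:,.0f}")   # -800,000
print(f"EV of abandoning: {ev_abandon:,.0f}")    # 0

# The rational comparison ignores the 5m already sunk. A manager who reasons
# "we have already spent 5m, so we must see it through" is letting the sunk cost
# drive the decision -- the escalating-commitment bias described above.
```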

A third bias, reasoning by analogy, involves the use of simple analogies to make sense out of complex problems. The problem with this heuristic is that the analogy may not be valid. A fourth bias, representativeness, is rooted in the tendency to generalize from a small sample or even a single vivid anecdote. This bias violates the statistical law of large numbers, which says that it is inappropriate to generalize from a small sample, let alone from a single case.

In many respects, the dot-com boom of the late 1990s was based on reasoning by analogy and representativeness. Prospective entrepreneurs saw some of the early dot-com companies achieve rapid success, at least as judged by some metrics. Reasoning by analogy from a very small sample, they assumed that any dot-com could achieve similar success. Many investors reached similar conclusions. The result was a massive wave of start-ups that jumped into the Internet space in an attempt to capitalize on the perceived opportunities. The vast majority of these companies subsequently went bankrupt, proving that the analogy was wrong and that the success of the small sample of early entrants was no guarantee that all dot-coms would succeed.
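
The law-of-large-numbers point can be made concrete with a short simulation. The sketch below is purely illustrative: the assumed 10% "true" success rate and the sample sizes are invented, not drawn from the article. It simply shows that success rates observed in small samples swing far from the underlying average, while large samples converge on it.

```python
import random

# Illustrative only: assume the "true" probability that a start-up succeeds is 10%.
TRUE_SUCCESS_RATE = 0.10
random.seed(42)

def typical_estimation_error(sample_size: int, trials: int = 2_000) -> float:
    """Average absolute gap between the success rate observed in a sample of
    `sample_size` start-ups and the true rate, over many simulated samples."""
    total_error = 0.0
    for _ in range(trials):
        successes = sum(random.random() < TRUE_SUCCESS_RATE for _ in range(sample_size))
        total_error += abs(successes / sample_size - TRUE_SUCCESS_RATE)
    return total_error / trials

for n in (5, 50, 500):
    print(f"sample of {n:>3} firms -> typical error in estimated success rate: "
          f"{typical_estimation_error(n):.3f}")

# Small samples (n=5) regularly suggest success rates far above or below 10% --
# the representativeness trap; larger samples converge on the truth, as the law
# of large numbers predicts.
```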

A fifth cognitive bias is referred to as the illusion of control, or the tendency to overestimate one’s ability to control events. General or top managers seem to be particularly prone to this bias: having risen to the top of an organization, they tend to be overconfident about their ability to succeed.

The availability error is yet another common bias. It arises from our predisposition to estimate the probability of an outcome based on how easy the outcome is to imagine. For example, more people seem to fear a plane crash than a car accident, and yet statistically one is far more likely to be killed in a car on the way to the airport than in a plane crash. People overweigh the probability of a plane crash because the outcome is easier to imagine, and because plane crashes are more vivid events than car crashes, which affect only small numbers of people at one time. As a result of the availability error, managers might allocate resources to a project with an outcome that is easier to imagine, rather than to one that might have the highest return.

Techniques for Improving Decision Making

The existence of cognitive biases raises a question: How can critical information affect the decision-making mechanism so that a company’s strategic decisions are realistic and based on thorough evaluation? Two techniques known to enhance strategic thinking and counteract cognitive biases are devil’s advocacy and dialectic inquiry. Devil’s advocacy requires

