Command Red Team
Red Team Challenges
(2) Confirmation Bias. There is a natural human tendency for analysts to see what they expect to see; to actively, but selectively, search for information that confirms their beliefs; to interpret new information in ways that reinforce their hidden assumptions and existing paradigms; and to overlook, discount, or misinterpret information that might contradict their preconceived notions. Analysis can be compromised when analysts see what they expect to see, and the impact of analytical products can be degraded when consumers hear what they expect to hear. One red team role is to propose tests to validate or invalidate such hidden assumptions. Encouraging analysts and decision makers to re-evaluate their biases is one of the more difficult red team tasks, requiring tact and patience.
(3) Status Quo Bias. Analysts, staff officers, and decision makers often unconsciously assume that the future must resemble the past, that current trends will continue indefinitely without change, or that conditions may slowly evolve at a convenient, manageable pace. The potential for fundamental, revolutionary change may be dismissed without a rigorous examination of visible indicators of change. The red team may help counter this "status quo bias" by promoting analysis of competing hypotheses (ACH), "what if?" analysis (also known as "backcasting"), exploration of signposts of change, or similar techniques.
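The ACH technique mentioned above can be illustrated with a minimal scoring sketch. In Heuer's formulation, hypotheses are ranked by how much evidence is inconsistent with them, not by how much is consistent. The hypotheses, evidence ratings, and names below are illustrative placeholders, not drawn from this publication:

```python
# Minimal sketch of an analysis-of-competing-hypotheses (ACH) matrix.
# Ratings per evidence item: "C" = consistent, "I" = inconsistent,
# "N" = neutral / not applicable. All entries here are hypothetical.
matrix = {
    "H1: adversary will attack":    ["C", "C", "C", "N"],
    "H2: adversary is bluffing":    ["C", "I", "I", "C"],
    "H3: adversary will negotiate": ["I", "C", "C", "C"],
}

def inconsistency_score(ratings):
    """Count inconsistent evidence items; under ACH, lower is better."""
    return sum(1 for r in ratings if r == "I")

# Rank hypotheses from least to most inconsistent with the evidence.
ranked = sorted(matrix, key=lambda h: inconsistency_score(matrix[h]))
for h in ranked:
    print(h, "-> inconsistencies:", inconsistency_score(matrix[h]))
```

The point of the exercise is diagnostic rather than computational: the matrix forces the analyst to test every hypothesis against every piece of evidence, so a favored hypothesis cannot survive merely because confirming evidence was sought for it.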
(4) Fighting the Plan. After an individual or group has worked to produce an assessment or plan, there is a natural resistance to seeing and accepting reasons to revise the completed product. The conditions or situation that the original product was based on may have changed, but the authors may be predisposed to dismiss evidence that their work needs to be re-accomplished. This sunk cost bias could devolve into efforts to promote work that has been overcome by events or to execute plans regardless of new conditions and requirements. Critical reviews may help highlight mismatches between existing products and the actual environment.
(5) Paradox of Expertise. The more an individual or organization is invested in a particular conceptual framework that has worked for them in the past, the more difficult it will be for them to accept new evidence that does not fit into that framework. This can leave the individual or staff vulnerable to missing early indications of radical change or misinterpreting the current situation. Key assumptions checks may help uncover inconsistencies in a conceptual framework, but it will often take time and patience to persuade an experienced staff officer to abandon familiar assumptions that have been successful in the past.
"We tend to perceive what we expect to perceive."

Richards J. Heuer
"Psychology of Intelligence Analysis," 1999
(6) Mirror Imaging. A common error is to unconsciously assume that the thinking and actions of an adversary or other actor are based on the same values, cultural imperatives, doctrines, perceptions, operational requirements, and limiting factors that guide friendly decisions. A psychological drive for coherence often causes individuals to