
level was announced on 22 January 2010, the government’s Home Secretary felt obliged to say: “This means that a terrorist attack is highly likely, but I should stress that there is no intelligence to suggest that an attack is imminent” (ref. 1). This shows that using words to express uncertainty, without a numerical reference scale, can be misleading unless audiences fully understand their usage in the particular context.

Words can also be used as direct translations of numerical probability assessments. For example, the Intergovernmental Panel on Climate Change (IPCC) has standardized its verbal terms for likelihood using the scale in Table 1 (ref. 2).

The IPCC sometimes uses these terms for communicating confidence in scientific conclusions, for example in stating that: “It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century”. The IPCC only uses these likelihood terms in situations of ‘high’ or ‘very high’ confidence (see below for the interpretation of this term), although it can be argued that such a restriction is unnecessary.

It is important that any numerical scale broadly reflects usage in that domain. The case study in this chapter, Accurate Communication of Medical Risk, notes that drug side-effects described as ‘common’ are intended to correspond to between 1% and 10%. This may be reasonable usage for pharmacologists, but not for patients.

4. Being imprecise about numbers

When it comes to expressing uncertainty in quantitative terms, a hierarchy of levels of precision can be assigned, appropriate to the confidence of the assessors. For example:

• Numbers to appropriate levels of precision.
• A distribution or range.
• A list of possibilities.
• A rough order of magnitude.

TABLE 1
IPCC LIKELIHOOD SCALE

Term                      Likelihood of the outcome (probability)
Virtually certain         99–100%
Extremely likely          95–100%
Very likely               90–100%
Likely                    66–100%
More likely than not      50–100%
About as likely as not    33–66%
Unlikely                  0–33%
Very unlikely             0–10%
Exceptionally unlikely    0–1%
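One practical point about the scale in Table 1 is that its bands overlap: a probability of 97%, for instance, falls in the ranges for “extremely likely”, “very likely”, “likely” and “more likely than not”. A minimal sketch (not an official IPCC tool; the function name and tie-breaking rule are illustrative assumptions) is to return the narrowest band that contains the given probability:

```python
# Table 1, expressed as (term, lower bound, upper bound) on the 0-1 scale.
# Note that the bands overlap, so several terms can apply to one probability.
IPCC_SCALE = [
    ("Virtually certain", 0.99, 1.00),
    ("Extremely likely", 0.95, 1.00),
    ("Very likely", 0.90, 1.00),
    ("Likely", 0.66, 1.00),
    ("More likely than not", 0.50, 1.00),
    ("About as likely as not", 0.33, 0.66),
    ("Unlikely", 0.00, 0.33),
    ("Very unlikely", 0.00, 0.10),
    ("Exceptionally unlikely", 0.00, 0.01),
]


def ipcc_term(p: float) -> str:
    """Return the narrowest IPCC likelihood term covering probability p.

    Choosing the narrowest matching band is an assumption made here to
    resolve the overlaps; the IPCC itself leaves the choice to the author.
    """
    matches = [(hi - lo, term) for term, lo, hi in IPCC_SCALE if lo <= p <= hi]
    # min() compares by band width first, so the most specific term wins.
    return min(matches)[1]
```

For example, `ipcc_term(0.97)` gives “Extremely likely” rather than the equally valid but less informative “Likely”.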


5. Expressing (lack of) confidence in the science

Our uncertainty does not just concern the likelihood of future outcomes. For a whole modeling process, we may have to contend with assumptions about the inputs, the model structure, and the issues of concern. Again, lists of possible values or scenarios can be provided as a sensitivity analysis. These can be supplemented by acknowledging that there are aspects that have been left out of the model, with a qualitative assessment of their potential impact.

Some domains have tried to use a summary qualitative scale to communicate confidence in the analysis. For example, the IPCC uses the quality of the evidence, and the degree of scientific agreement, to assess a level of confidence in its scientific conclusions.

Seeking a common language

There is room for substantial improvement in the quality of the discourse between the participants involved in innovation and risk, including the public, regulators, innovators, NGOs and the media. But “seeking a common language” does not refer to a restricted use of specific words, such as risk, uncertainty and ambiguity — that would be impossible to impose and counter-productive to try. Rather, by “common language” I mean the acceptance of a set of principles regarding the presentation of arguments or analyses. Five such principles are outlined below.

1. Acknowledge that any risk and uncertainty assessment is contingent and provisional. No analysis can claim to produce the risk of an innovation. Such assessments are constructed depending on assumptions, and may change on receipt of further information. The reasons for the uncertainty should also be given. And since there will be competing opinions, not all of which will be based on an adequate consideration of the evidence, the pedigree of an analysis is also important.

It would be useful to have an appraisal of the quality of the analytic framework: this could be self-assessed by the analysts (although this would require considerable humility) but also assessed by the team responsible for formulating the policy in question. The appraisal should be based on the quality, quantity, consistency and coverage of evidence, as well as the quality of the process that collected and analyzed the evidence, and the quality of deliberation. Policymakers can then make a holistic summary of the state of uncertainty and how it influences confidence in the conclusion (see Table 2 for a possible rough categorization of such an assessment, roughly modeled on the Grading of Recommendations Assessment, Development and Evaluation (GRADE) scale used in health (ref. 3)).

Clearly the strength of any recommendation does not<br />

