

“The three year Dryden Inquiry presented a rare opportunity for an independent body to examine the entire Canadian aviation system for organisational failures, both latent and active, which might have contributed to the Captain’s faulty decision, and to make recommendations for necessary change. It was an opportunity not to be squandered.”16

18.13 The authors of Beyond Aviation Human Factors point out that, even if pilot error had not occurred on this occasion and the crash of Flight 1363 had not taken place, “…the flawed organisational processes and the latent failures would still remain. Dryden would take place elsewhere, given the degree of sickness, the numerous pathogens hidden in the system.”17

Accident Theories

18.14 There are two main accident theories:

18.14.1 Normal Accident Theory: ‘Normal Accident Theory’ holds that, when technologies become very complex and ‘tightly coupled’, accidents become inevitable and therefore, in a sense, ‘normal’.18 This theory takes a pessimistic, but not defeatist, view of the ability of organisations and individuals to manage high-risk technologies.

18.14.2 High Reliability Theory: ‘High Reliability Theory’ argues that organisations responsible for operating high-risk technologies can successfully compensate for inevitable human shortcomings which would normally lead to catastrophic failures. Proper design, management and training are seen as important requisites for being a highly reliable organisation.

Normal Accident Theory

18.15 Normal Accident theorists recognise that high-risk technologies are prone to accidents and catastrophes when the right combination of circumstances comes together to defeat safety devices and/or the efforts of those responsible for coping with such events. The theory recognises the possibility of such events, rather than being a statement of despair or “learned helplessness”.19 However, Charles Perrow’s basic pessimism is unmistakable:

“No matter how hard we try, no matter how much training, how many safety devices, planning, redundancies, buffers, alarms, bells and whistles we build into our systems, those that are complexly interactive will find an occasion where the unexpected interaction of two or more failures defeats the training, the planning, and the design of safety devices.”20

“If interactive complexity and tight coupling – system characteristics – inevitably will produce an accident, I believe we are justified in calling it a normal accident, or a system accident. The odd term ‘normal accident’ is meant to signal that, given the system characteristics, multiple and unexpected interactions of failures are inevitable. This is an expression of an integral characteristic of the system, not a statement of frequency… System accidents are uncommon, even rare; yet this is not all that reassuring, if they can produce catastrophes.”21

16 See Address by The Honourable Virgil P Moshansky, CM QC – The Halifax 5 – Canadian Health Care Safety Symposium at Calgary, Alberta, Saturday 22 October 2005.
17 Ibid, page 81.
18 See the work of Charles Perrow, a leading proponent of the ‘Normal Accident Theory’.
19 James Reason.
20 Cited by Scott D Sagan in The Limits of Safety, 1993, page 45.
21 Charles Perrow, Normal Accidents – Living With High-Risk Technologies, 1984, page 4 (Princeton publishing).
