The Nimrod Review - Official Documents


Chapter 17 – Columbia and Other Lessons<br />

17.33.2 “Years of workforce reductions and outsourcing have culled from NASA’s workforce the layers of experience and hands-on systems safety that once provided a capacity for oversight. Safety and Mission Assurance personnel have been eliminated, careers in safety have lost organisational prestige, and the program now decides on its own how much safety and engineering oversight it needs.” 31<br />

17.33.3 “NASA’s culture of bureaucratic accountability emphasised chain of command, procedure, following the rules, and going by the book. While rules and procedures were essential for co-ordination, they had an unintended but negative effect. Allegiance to hierarchy and procedure had replaced deference to NASA’s engineers’ technical expertise.” 32<br />

CAIB’s conclusions on organisational problems at NASA<br />

17.34 <strong>The</strong> CAIB’s conclusions on the deep-seated organisational problems that it found in NASA included the following thoughts:<br />

17.34.1 Leaders create culture. It is their responsibility to change it. Top administrators must take responsibility for risk, failure, and safety by remaining alert to the effects their efforts have on the system. Leaders are responsible for establishing the conditions that lead to their subordinates’ successes or failures. <strong>The</strong> past decisions of national leaders, the White House, Congress, and NASA headquarters, set the Columbia accident in motion by creating resource and schedule strains that compromised the principles of a high-risk technology organisation. NASA’s success came to be measured by cost reduction and whether the schedule was met.<br />

17.34.2 Changes in organisational structure should be made only with careful consideration of their effect on the system and their possible unintended consequences. Changes that make an organisation more complex may create new ways in which it can fail. When changes are put in place, the risk of error initially increases as old ways of doing things compete with new ones. Institutional memory is lost as personnel and records are moved and replaced. Changing the structure of organisations is complicated by external budgetary and political constraints, the inability of leaders to conceive the full ramifications of their actions, and the failure to learn from the past. Nonetheless, change must be made. NASA’s blind spot was that it believed it had a strong safety culture. Programme history shows that the loss of a truly independent, robust capability to protect the system’s fundamental requirements and specifications inevitably compromised those requirements, and therefore increased risk.<br />

17.34.3 Strategies must increase the clarity, strength and presence of signals that challenge assumptions about risk. Twice in NASA’s history the agency embarked down a slippery slope that resulted in catastrophe. Each decision, taken by itself, seemed correct, routine and indeed insignificant and unremarkable, yet the cumulative effect of those decisions was stunning. A safety team must have equal and independent representation so that managers are not again lulled into complacency by shifting definitions of risk. It is worth acknowledging that people who are marginal and not in positions of power may have useful information that they do not express. Even when such people are encouraged to speak, they find it intimidating to contradict a leader’s strategy or a group consensus.<br />

17.34.4 Responsibility and authority for decisions involving technical requirements and safety should rest with an independent technical authority. <strong>The</strong> CAIB said there were lessons to be learned from other organisations that had been accident-free, including the US Navy Submarine and Reactor Safety programs: 33 “Organisations that successfully operate high-risk technology have a major characteristic in common: they place a premium on safety and reliability by structuring their programs so that technical and safety engineering organisations own the process of determining, maintaining and waiving technical requirements with a voice that is equal to yet independent of Program managers<br />

31 CAIB Report, page 181.<br />

32 CAIB Report, page 200.<br />

33 CAIB Report, page 182.<br />

