
28 NAVY ENGINEERING BULLETIN SEPTEMBER 2003

BY CAPTAIN ANDREW CAWLEY,
COMMANDING OFFICER,
HMAS CRESWELL

About the author: Andrew Cawley joined the RAN in 1982, undertaking general Junior Officer training as a Sub-Lieutenant in HMAS VAMPIRE from October. In August 1983, he joined HMAS PERTH as a Systems Engineer Officer. Promoted to Lieutenant in April 1984, he posted to HMAS CERBERUS for Weapons Engineer specialist training in late 1984. He returned to sea in HMAS HOBART in early 1985 and later HMAS BRISBANE as a Systems Engineer Officer.

Captain Cawley joined the Navy Office Directorate of Fleet Engineering Policy at the end of 1985 as the Systems Engineer for non-USN weapons systems.

Moving to Perth in early 1989, he joined the small team building up commercial refitting practices for Navy in Western Australia. He joined HMAS STUART as Weapons Electrical Engineering Officer (WEEO) in October 1989 and was promoted Lieutenant Commander in mid-1990. When STUART decommissioned from the RAN in May 1991, he returned to Navy Office in Canberra as the Staff Officer for Engineer Officer career management in the Directorate of Naval Officers' Postings.

In June 1993, Captain Cawley returned to sea as WEEO of HMAS PERTH, which included the milestone of passing management of high power generation and distribution in Navy to the Marine Engineer in August 1994. Selected for promotion in December 1994, he left HMAS PERTH to undertake a five-month project with Naval Support Command to establish engineering support models and processes in preparation for the full commercialisation of Australian Defence Industries (ADI) and the dockyards. In May 1995, Captain Cawley posted as OIC of the Navy's Technical Training Centre at HMAS CERBERUS. Seconded to the Joint Education and Training Executive in May 1997, Captain Cawley undertook the Defence Efficiency Review related study into ADF Technical Training Rationalisation. The report was tendered in February 1998.

Before joining Maritime Command in June 1999, Captain Cawley completed a Masters of Engineering Management, specialising in Systems Engineering, at the University of Technology, Sydney.

On 1 July 2000, Captain Cawley was promoted to his current rank. He posted to HMAS CRESWELL in October 2000, where he took up his dual role as Commanding Officer, HMAS CRESWELL, and Training Authority, Initial Training Leadership and Management. Captain Cawley is married to Anna Glynne and lives at HMAS CRESWELL in Jervis Bay.

A Risk—Is It Really?

Over the last few years, the culture of carefully considering risk in our decision making has taken hold. People are well attuned to querying what risks are associated with an evolution, and it is commonplace for people to do an "HRA". But is it really a risk, or is it something else?

The process of a Hazard Risk Assessment is well documented in the NAVSAFE Manual, ABR 6303. Risk is calculated according to the Australian Standard, AS 4360. Most people can tell you that you calculate risk as the product of consequence and probability. The issue I'd like to raise in this brief article is: what do you really know about the probability of an event occurring?
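The "product of consequence and probability" idea can be sketched in a few lines of code. This is an illustrative toy only: the 1-5 severity scale and the strict probability bounds below are assumptions made for the sketch, not the official NAVSAFE (ABR 6303) or AS 4360 matrices.

```python
# Toy sketch of "risk = consequence x probability".
# The 1-5 severity scale and the bounds are illustrative assumptions,
# not the official NAVSAFE (ABR 6303) or AS 4360 scales.

def risk_score(consequence: int, probability: float) -> float:
    """Return a risk score as the product of a severity rating and a
    mathematically known probability (0.0 < p < 1.0)."""
    if not 1 <= consequence <= 5:
        raise ValueError("consequence must be on the assumed 1-5 scale")
    if not 0.0 < probability < 1.0:
        raise ValueError("probability must lie strictly between 0 and 1")
    return consequence * probability

# A severe consequence (rated 4) with a modelled probability of 0.1:
print(risk_score(4, 0.1))
```

The point of the guard on `probability` is the point of this article: the function only makes sense if the number fed into it is a genuinely known probability, not a gut feeling.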

Many times I have heard people say 'the probability of that occurring is very low'. That sort of answer is either a fact or a guess, and too often it is a guess. How many people do you think say that because they have a 'gut feeling' it does not happen very often, or because they themselves have never witnessed such an event? How many times have you heard people offer such a judgement about something in which they actually have no expertise?

In 1992, Brian Wynne, a UK academic working in the field of environmental risk, developed what he called his Taxonomy¹ of Risk. The first level is where we know about the behaviour of a system² and we can model the probability of something happening. That is to say, we can determine a mathematical probability, 0.0 < p < 1.0. If we know about consequence, then we can calculate risk and use this to guide our decision making.

If we possess specifications for system design (variables) and operation, we can claim to know something about system behaviour, but we cannot simply extend that claim to say we know about the probability distribution of various events occurring. There must be specific, objective knowledge about event probability. If we do not know the probability distribution of events, we are not able to assign a mathematical value to probability, and we therefore cannot calculate risk. This is uncertainty.

But it does not end there; Wynne takes the issue two steps further. People can generally accept that they do not understand something about a system that lies before them because they do not have the expertise to work it out. But regardless of whether you are an expert or a novice, there is always the unknown. What decision-makers need to remember is that there may be events (probability) or hazards (consequences) we simply do not know about. The issue for decision-makers is to be wary of ignorance, and wary of people who say they have worked it all out and the total (∑) risk is known and acceptable.

The next level in Wynne's taxonomy is the unknowable. Given time and effort, we may discover new information about a system and convert the unknown into the known. In the context of risk management, the 'known' will become a calculable risk if we understand its probability distribution (and consequence), or it will become uncertainty. However, when 'causal chains and networks are open', there is always the unknowable.

In short, Wynne has usefully distinguished a taxonomy:

• Risk—where you know the probability distribution;
• Uncertainty—where you do not know the probability distribution;
• Unknown; and
• Unknowable.
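One way to read the taxonomy is as a guard on the calculation itself: only call something a risk when an objective probability distribution exists. The sketch below is hypothetical — the `Assessment` fields and `classify` function are my own illustration, not anything from ABR 6303 or AS 4360 — and note that the unknowable, by definition, cannot be represented in any such model.

```python
# Hypothetical sketch of Wynne's distinction: refuse to call something
# a "risk" unless the probability rests on objective knowledge.
# The field names and evidence categories are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Assessment:
    consequence: Optional[float]  # None: a hazard we do not know about
    probability: Optional[float]  # None: no probability information at all
    evidence: str                 # "measured", "modelled", or "guess"

def classify(a: Assessment) -> str:
    if a.consequence is None or a.probability is None:
        return "unknown"       # events or hazards we simply do not know about
    if a.evidence == "guess":
        return "uncertainty"   # no objective distribution: risk cannot be calculated
    return "risk"              # distribution known: risk = consequence x probability

print(classify(Assessment(4.0, 0.1, "guess")))     # a guess is uncertainty, not risk
print(classify(Assessment(4.0, 0.1, "modelled")))  # a calculable risk
```

The fourth category never appears in the code: an open causal network is precisely what no data structure can enumerate.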

So, what is the point of this? Next time someone says 'the probability of that occurring is very low', test them on what they really know about the probability distribution. If they are just guessing, then tell them it is not risk, it is uncertainty! While that might be a play on words, what is important is that you know the difference between guesswork and science, and base your decisions accordingly.

1 Taxonomy: classification, especially in relation to its general laws and principles; a systematic classification.

2 Meaning "system" in the broadest sense of the word, not simply electro-mechanical devices.

References:

Wynne, B., 1992. Uncertainty and environmental learning: reconceiving science and policy in the preventive paradigm. Global Environmental Change 2(2), 111-127.

Irish, J., 1998. Risk, Uncertainty, Ignorance and Indeterminacy. University of Technology, Sydney.
