Safety culture: transferring theory and evidence from the major hazards industries

… increasingly important in bridging European and North American approaches to risk management, to high reliability organisations, and to the investigation and prevention of large-scale accidents. The increasing globalisation of many systems of production and finance, and dependence upon organisational systems as we approach the millennium and beyond, means that these factors are of increasingly wider relevance.

In traditional 'high hazard' industries such as nuclear, petrochemicals and aviation it is believed that organisational culture presents us with the next critical frontier in accident prevention efforts as, increasingly, many of the more obvious technical and human factors routes for safety improvement are exhausted. Furthermore, attempts to measure and improve safety culture are moving beyond these industries into more mundane organisational contexts where routine accidents occur (HSE, 1997). In the road transportation context one can therefore ask whether some part of the variance in fleet driver behaviour, and consequent accidents, might be understood in relation to the new theory and empirical research being generated by safety culture research.

Moving from an initial concern with the modelling of organisational preconditions to failure or hazard, contemporary researchers and practitioners are now concerned to specify how knowledge of such factors might be used to enhance safety. Chernobyl again marks a critical juncture. Considerable evidence was available prior to 1986 of the very severe problems of control posed by administrative failures of foresight in complex socio-technical systems (Turner, 1976, 1978). And while the earlier Three Mile Island accident in the USA had been interpreted, by Perrow (1984) at least, as evidence for the inevitability of serious accidents in certain forms of complex system (of which nuclear power plants were his paradigm example), the organisational errors and violations of operating procedures which contributed in part to the Chernobyl disaster were interpreted by commentators as evidence of a 'poor safety culture', both at this plant and within the former Soviet Union nuclear industry more generally (OECD Nuclear Agency, 1987).

Of course, in the late 1980s this hypothesis stemmed much more from a rhetorical attempt to reassure Western publics that Chernobyl couldn't happen here than from any direct or systematic social science analysis of the deep and complex issues involved in this question. Hence, our own research group's review of the immediate post-Chernobyl discussions of safety culture critiqued its reduction to a combination of administrative procedures and individual attitudes to safety (Turner et al, 1989), which seemed to us to be at the expense of the wider organisational issues. What was crucially missing, both then and in many contemporary treatments, was the shared characteristic of all social organisation and culture.

Implicit in the original claims is the assumption that if the Chernobyl accident – or the King's Cross Underground fire in London, or the Exxon Valdez accident in Alaska – could indeed be put down to a 'poor' safety culture, then there might surely be something, for want of a better word, called a 'good' safety culture, which safety managers might then promote, design, or encourage in order to head off some of the worst consequences of organisational-system failure. In the 1990s, therefore, interest in the topic of safety culture has burgeoned as engineers, risk managers and nuclear safety practitioners, and even industrialists, have attempted to operationalise the concept and to judge its significance. In parallel, both risk management consultants, and to a lesser extent the applied social science research communities, have responded with a variety of conceptual frameworks (of varying degrees of theoretical grounding) and a rapidly expanding empirical research literature, particularly in Europe (see Cox and Flin, 1998).


… organisations. It also provides the conceptual foundation for an anthropological definition of a safety culture as the set of assumptions, and their associated practices, which permit beliefs about danger and safety to be constructed.

Turner's data also indicate that, despite the existence of warnings and potential signals of a deteriorating situation, a number of common information handling difficulties (such as wrong assumptions about the significance of warnings, communication problems, uncertainty about violations of regulations, and the tendency towards over-optimism) mean that organisations can effectively go into 'cultural denial' during the hazard incubation period. In particular, intelligence gathering and interpretation are often bounded by administratively defined perceptual rigidities and frames of reference regarding what is a legitimate hazard to attend to. Anomalous events, warning signals and other issues that fall outside such frames are then very readily neglected or overlooked. Nor is the solution to the problems of information handling simply to be found in more communication, since it is equally important to avoid being overwhelmed by a barrage of noise in ill-structured risk situations. Hence the model points to the need for selective gathering of, and attention to, high quality information as a response to deep forms of uncertainty. Accordingly, we have argued that at the heart of a safety culture is the way in which organisational intelligence and safety imagination regarding risk and danger are deployed (see Pidgeon and O'Leary, 2000).

Safety culture

As outlined above, the man-made disasters model defines disaster incubation in terms of a discrepancy between some deteriorating but ill-structured state of affairs and the culturally 'taken for granted' – more specifically, the norms, assumptions and beliefs adopted by an organisation or industry for dealing with danger. One implication of this analysis is that culture is positioned at the heart of the system vulnerability problem, because of its role in shaping blindness to certain forms of hazard (ie those which are at variance with the taken for granted). As Vaughan puts it with respect to the Challenger disaster, NASA's culture provided 'a way of seeing that was simultaneously a way of not seeing' (1996, p. 394).

The recurrent pattern of organisational and human failures, coupled with misinterpretation or concealment of 'warnings', also implies, in theory at least, that significant system characteristics might be identified which are holistic – rather than strictly determining – indicators of the developing incubation period. It is therefore not surprising to find work drawing upon this tradition moving on from purely retrospective analyses of failure cases to pose the question of the predictiveness of the model, and in particular the conditions for 'safe' organisation.

The very public discussion of culture following Chernobyl did raise in our own minds the question of what a well-grounded sociological treatment of this issue would tell us. In the sociological and anthropological literature there are of course very many views on what is shared by, and defines the culture of, a social group. Sometimes culture is discussed in terms of observable behaviours ('the way we do things around here') and sometimes more as a system of symbols or meanings (for example, as a shared cognitive model, or as the assemblage of stories, arguments, myths, rituals and symbols that permeate organisational life). Note that the two approaches are to some extent interdependent, in that meaning often both constructs the 'object' of inquiry and is in turn constructed itself through observable behaviour and material life.


Our own approach to the issue, and that implicit in the man-made disasters model, views culture primarily in terms of the second tradition; that is, involving the exploration of meaning and of the symbols and systems of meaning through which a given group understands the world. A safety culture is in turn the set of assumptions, and their associated practices, which permit beliefs about danger and safety to be constructed (see, amongst others, Turner et al, 1989; Pidgeon, 1991; Turner, 1991). Such a culture is itself created and recreated as members repeatedly behave and communicate in ways which seem to them to be 'natural', obvious and unquestionable, and as such will serve to construct a particular version of risk, danger and safety.

We have also argued, in more speculative fashion, that a 'good' safety culture might both reflect and be promoted by at least four facets: senior management commitment to safety; shared care and concern for hazards and a solicitude over their impacts upon people; realistic and flexible norms and rules about hazards; and continual reflection upon practice through monitoring, analysis and feedback systems.

In exploring safety cultures as a route to organisational resilience we must go beyond individual attitudes to safety to the level of shared cognitions, and to the organisational structures and resources which support (rather than constrict) the development of organisational intelligence and safety imagination (Pidgeon and O'Leary, 2000) regarding risk and danger.

It is important to note, finally, that to speak of an organisational safety culture as a homogeneous entity glosses over some significant complexities. First, there is the dynamic and potentially unstable nature of culture. Second, some authors (eg Schein, 1985) have embraced layered models of organisational culture – differentiating between core and peripheral aspects. Leading on from this, it is not as yet clear how the beliefs and norms of a safety culture stand in relation to an organisation's wider culture and values, let alone to those of wider society: such things as beliefs about security, achievement, benevolence and justice.

Some emerging findings

A growing and recent approach to organisational safety culture frames it in terms of a set of individual attitudes and practices within a hazardous work context, and in this way maps closely onto the earlier notion of an occupational 'climate of safety' (Zohar, 1980). Designers of quantitative psychometric climate scales aim to provide information on the factors influencing perceptions and beliefs within an organisation. One can then, in theory at least, identify the factors which discriminate high from low accident plants, organisations or workgroups. Whether the underlying 'culture' and 'climate' (as measured through such questionnaires) are one and the same is a point of some current debate (see Mearns et al, 1998). Also, as Cox and Flin (1998) correctly point out, there is currently not enough consistent (or published) data to be able to test the reliability of existing definitions or measures. Nor, as we note above, is there sufficient theoretical debate around the various measures used.
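To make the logic of such climate measurement concrete, the following is a minimal illustrative sketch only (the file name, item names, scoring scheme and accident-rate banding are hypothetical and not drawn from any of the studies cited) of how composite climate scores might be derived from Likert-type questionnaire items and compared across high- and low-accident workgroups:

```python
import pandas as pd
from scipy import stats

# Hypothetical survey data: one row per respondent, items scored 1-5,
# plus the accident-rate band ("high" or "low") of the respondent's workgroup.
df = pd.read_csv("climate_survey.csv")

items = ["mgmt_commitment", "rule_attitudes", "reporting_openness"]

# Reverse-score a negatively worded item so that higher always means "safer".
df["rule_attitudes"] = 6 - df["rule_attitudes"]

# Composite safety climate score per respondent.
df["climate_score"] = df[items].mean(axis=1)

# Does the composite discriminate high- from low-accident workgroups?
low = df.loc[df["accident_band"] == "low", "climate_score"]
high = df.loc[df["accident_band"] == "high", "climate_score"]
t_stat, p_value = stats.ttest_ind(low, high, equal_var=False)
print(f"low-accident mean = {low.mean():.2f}, high-accident mean = {high.mean():.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```

In practice, as the studies cited above make clear, item sets and factor structures vary considerably between instruments, which is one reason why the reliability of such measures remains hard to establish.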


However, the emerging findings from those psychometric studies that are published, if taken together with evidence and arguments from organisational accident and high reliability studies, do point to some of the issues that now appear to be most critical to safe performance. The paper briefly discusses a number of these in turn.

Management commitment to safety

A number of authors have argued that top management commitment is an essential ingredient of a safety culture (Cohen, 1977; Zohar, 1980; Pidgeon, 1991). This is particularly important because attempts to promote enduring organisational change are unlikely to succeed if senior management are not seen to be closely involved in, and committed to, initiatives. Employees will quickly sense where management's true priorities lie (eg optimising production), and may conform to these tacit norms even when they conflict with explicit policy statements (such as always running a safe fleet). According to Zohar (1980), one indirect sign to employees of management commitment will be the perceived status within the organisation of the personnel directly dealing with safety. Confirming Zohar's initial work in this area, subsequent psychometric studies by Guest et al (1994), Flin et al (1996) and Cheyne et al (1998) all point to the issue of perceived management commitment – which is also bound up with the relationship of trust (or otherwise) between workers and management – as critical to shaping attitudes towards safety and risk perceptions.

Attitudes towards violations of rules and procedures

Accidents that result from violations of rules and procedures, and the tension between the needs of safety and production, have been the focus of theoretical and empirical study by the Manchester/Leiden research group (Free, 1994; Reason et al, 1994; Hudson et al, 1997). In safety climate studies both Cheyne et al (1998) and Mearns et al (1998) report attitudes towards violations (questionnaire items such as 'accidents are tolerated as a part of the job' or 'need to take shortcuts') as one of the important determinants of overall safety attitudes.

Vaughan's excellent (1996) 'historical ethnography' of the events at NASA and its contractor organisations prior to the Challenger disaster points to some of the reasons why 'violation' can become the rule rather than the exception. The technical reason for this disaster is now well known: the failure of Viton O-ring seals in the solid rocket boosters shortly after launch, leading to the structural break-up and catastrophic loss of the system and its crew. Vaughan points out that the Shuttle and its operation were at the cutting edge of high performance aerospace systems, where it is often impractical to build in wide safety margins, or to adopt the course of conservatism in engineering design. Furthermore, decisions were constantly required under circumstances of imperfect knowledge. Under such prototyping conditions acceptable design limits and operations cannot be fully proceduralised in advance, and while there were rules designed to highlight areas of ongoing safety concern (of which the O-rings were one), there was also an elaborate system of other standard procedures at NASA that permitted 'violations' when necessary. Accordingly, the conventional hindsight interpretation of a rule violation as a wilful transgression of set administrative procedures or design criteria is not useful. Under such circumstances what is to be regarded as a risk – and in particular what is judged an acceptable risk – becomes a question of social negotiation (see also Short and Clark, 1993).


In the case of the Shuttle the warnings of danger, which were recognised and taken very seriously by many of those involved, were rarely unequivocal: some were weak in their implications; others gave mixed messages about safety; and still others were embedded in considerable noise. Vaughan's account shows how the significance of the available warnings for flight safety, and the norms through which the risk was judged, were accordingly continually negotiated and re-negotiated through the working practices of the teams of engineers. This worked well in resolving many of the safety problems with the Shuttle, but for the O-ring seals a cycle of decision making was set in motion in which signals of the 'deviant' behaviour of the system were successively 'normalised' as acceptable through the standard process of risk assessment institutionalised in the working practices of the organisations. This process, which Vaughan terms the 'normalisation of deviance', was repeated throughout the history of the joint problems. By deconstructing the notions of 'safety norm' and 'violation', this analysis also suggests that attempts to investigate the micro-world of organisational risk perception and decision making might take account of the dynamic ways in which risk is constructed by workgroups.

Organisational sub-cultures

Safety culture is often discussed as if it were a unified property that characterises an organisation throughout. However, any large organisation, or set of co-operating organisations, will comprise different social sub-groupings (drivers and maintenance, production and managers, permanent staff and sub-contractors, site A compared to site B). For example, in aviation such sub-groupings are often associated with very different attitudes towards safety (O'Leary and Pidgeon, 1994). Evidence for the existence of safety sub-cultures comes from studies in the workplace by Sinclair and Haines (1993), Gherardi, Nicolini and Odella (1996) and by Mearns et al (1998). The latter authors, in a large survey of safety climate in offshore oil installations, note significant differences in attitudes towards safety between staff at different levels of the organisations studied (eg managers vs workers), but far smaller differences between operators who were contractors and those who were permanent staff. This latter finding might be explained by professional sub-cultures shared across the industry (ie attached to specific roles or levels of training, irrespective of the organisation one actually works for), or by the influence of the safety sub-cultures on a particular platform in moulding the attitudes of everybody working there. They speculate that the absence of a unitary organisational culture may not necessarily be a bad thing as 'it may always require some dissenting voices in the crowd to continually improve the state of safety'.
In theoretical terms different sub-cultures might indeed be a useful counter to the dangers of corporate myopia (Pidgeon, 1998).

Organisational learning

There are very few contemporary treatments of safety culture that do not accord considerable importance to the goal of learning from incidents, accidents and other relevant experience through such things as monitoring, incident analysis and feedback systems, as one part of an integrated safety management system (eg ACSNI, 1993; Toft and Reynolds, 1997). Curiously, the related questions of organisational politics and power are absent from most theoretical models of organisational failures, and from many discussions of safety cultures too. We would argue that organisational politics and power are crucial in determining whether a number of the goals of a safety culture will be met, and in particular that of organisational learning (see Pidgeon, 1997).


What seems to lie at the heart of this issue is the institutional dilemma of blame. Danger and blame have been ubiquitous features of societies over the years, as one means of defending favoured institutional arrangements, and for this reason they underpin many contemporary discussions of risk and safety too. Blame does of course bring positive, as well as negative, possibilities for safety. In so doing it presents a dilemma of design. Responsibility without accountability for the worst excesses of accidents and disasters may fail to motivate representatives of organisations to act in good faith. On the other hand, seeking a 'culprit' whenever an error has occurred may promote the avoidance of blame rather than critique and honesty. Hence, efforts to motivate people to act safely through sanction may be self-defeating, resulting in the very state of poor or incomplete information which is a pre-condition to system vulnerability.

If politics and blame do corrupt the possibilities for organisational learning then surely the most important challenge resides here: in the ways by which such processes might be counteracted. More pointedly, we can ask what political, cultural, symbolic and institutional arrangements support the generation of organisational intelligence over corporate myopia. As La Porte and Rochlin (1994) point out, the question then becomes not whether the organisation learned everything it could from a specific incident or accident (for very few will), but whether the organisation recognises a need to handle safety intelligence better in the future.

The obstacles which blame and organisational politics place in the way of learning cannot be expected to yield to simple solutions. In some high-risk contexts solutions have evolved over time, as with civilian aviation incident reporting and monitoring (see Pidgeon and O'Leary, 2000). Such monitoring (of, for example, air near-misses) is held by many to make a positive contribution to collective learning in the industry, and through this to safety (see Chappell, 1994). The question of how a reporting or monitoring system can be successfully embedded within the local social and political contexts (sometimes both organisational and national) where it will be expected to operate is invariably not posed, and inattention to such issues is a major factor when such systems do not work as intended, or do not work at all.

Concluding comments: culture and fleet driver behaviour

The study of safety cultures and climates presents a relatively new and expanding area of research activity. Although safety culture has not yet been extensively studied in the context of driver behaviour, possible avenues for future research are suggested in relation to fleet safety and the transferability of methods and findings from the high hazard context. For example, we might ask whether a significant proportion of the variance in fleet safety is indeed attached to the organisational unit of analysis, as experience in other industries suggests will prove to be the case. It would be easy to claim that tools, results and findings might be readily transferred to the domain of road fleet safety. Experience suggests, however, that while many of the broad principles will find applicability in this new domain, contextual differences will inevitably require research instruments that are closely attuned to the fleet safety domain. Matters are not helped here because, as Flin (1998) rightly points out, there are a variety of customised safety culture measures now on offer, grounded in different industrial contexts, and with few attempts to reconcile underlying frameworks or to conduct meta-analyses across (often) confidential databases. Accordingly, a combination of qualitative and quantitative research might be needed first to identify the important context-specific issues in relation to fleet driver safety.
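As a purely illustrative sketch of how the first of these questions might be examined (the file name, variable names and fleet identifiers below are hypothetical, and any real study would need instruments attuned to the fleet context, as argued above), the share of variance in driver incident rates attached to the organisational unit could be estimated as an intraclass correlation from a random-intercept model:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data set: one row per fleet driver, with an annual incident
# rate and an identifier for the company/fleet the driver works for.
df = pd.read_csv("fleet_incidents.csv")

# Random-intercept model: individual incident rates vary around a fleet-level mean.
model = smf.mixedlm("incident_rate ~ 1", data=df, groups=df["fleet_id"])
result = model.fit()

between_fleet_var = float(result.cov_re.iloc[0, 0])  # variance of fleet intercepts
within_fleet_var = result.scale                      # residual (driver-level) variance

# Intraclass correlation: proportion of total variance attached to the fleet.
icc = between_fleet_var / (between_fleet_var + within_fleet_var)
print(f"Proportion of variance at the organisational (fleet) level: {icc:.2f}")
```

A non-trivial intraclass correlation would be consistent with the experience of other industries cited above, namely that the organisation, and not just the individual driver, is an appropriate unit of analysis and intervention.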


However, some lessons are suggested by our review of existing research on safety culture. Inevitably these issues interact with one another, but a first lesson concerns the unit of analysis adopted. Driver behaviour research and intervention tends, for obvious reasons, to focus either upon the individual driver or upon engineered solutions that change the driver's environment. In considering fleet safety, the organisation should be considered as much a part of the 'environment' as is the physical layout of roads. Accordingly, attention should focus not only upon the individual driver, but also upon the attitudes of company management and line supervisors (who will influence driver behaviour through their own commitment to safety). A hypothesis here would be that changes in fleet safety attitudes are already underway in companies and industries where safety has become a key business priority, or will be easy to instigate in such companies. Likewise, the issue of occupational sub-cultures highlights the potential for training to cover some of the softer issues, such as attitudes towards violations and the balance to be struck between performance and safety. Finally, mechanisms for organisational learning, such as confidential incident reporting systems, have begun to be taken up in a variety of industries. While such systems are unlikely to be appropriate to every fleet driver context, and are certainly very difficult to instigate (Pidgeon, 1997), in larger organisations they might bring considerable benefits, as well as helping to underline the wider commitment to safety, both across industries and within organisations, that is a part of an effective safety culture.

References

ACSNI (1993) Advisory Committee on the Safety of Nuclear Installations: Study Group on Human Factors. Organizing for Safety. Health and Safety Commission, London.
Chappell, S.L. (1994) Using voluntary incident reports for human factors evaluations. In N.A. Johnston, N. McDonald and R. Fuller (eds.), Aviation Psychology in Practice. Avebury Technical Press, Aldershot.
Cheyne, A., Cox, S., Oliver, A. and Tomas, J.M. (1998) Modelling safety climate in the prediction of levels of safety activity. Work and Stress, 12, 255-271.
Cohen, A. (1977) Factors in successful occupational safety programs. Journal of Safety Research, 9(4), 168-178.
Cox, S. and Flin, R. (1998) Safety culture: philosopher's stone or man of straw? Work and Stress, 12, 189-201.
Flin, R. (1998) Safety condition monitoring: lessons from man-made disasters. Journal of Contingencies and Crisis Management, 6(2), 88-92.
Flin, R., Mearns, K., Fleming, M. and Gordon, R. (1996) Risk perception and safety in offshore workers. Safety Science, 22, 131-145.
Free, R. (1994) The Role of Procedural Violations in Railway Accidents. PhD Thesis, University of Manchester.


Gephart, R.P. Jnr. (1984) Making sense of organisationally based environmental disasters. Journal of Management, 10(2), 205-225.
Gherardi, S., Nicolini, D. and Odella, F. (1996) What do you mean by safety? Conflicting perspectives on accident causation and safety management inside a construction firm. Unpublished manuscript, Department of Sociology and Social Research, University of Trento, Italy.
Guest, D. (1994) Safety culture and safety performance: British Rail in the aftermath of the Clapham Junction disaster. Department of Occupational Psychology, Birkbeck College.
Guldenmund, F. (2000) Safety culture: a review. Safety Science, in press.
Health and Safety Executive (1997) Safety Climate Measurement Tool. HSE Books, London.
Hudson, P., Verschuur, W.L., Lawton, R., Parker, D. and Reason, J.T. (1997) Bending the Rules II: The Violations Manual. Rijks Universiteit, Leiden.
Kennedy, R. and Kirwan, B. (1995) The failure mechanisms of safety culture. In A. Carnino and G. Weimann (eds.), Proceedings of the International Topical Meeting on Safety Culture in Nuclear Installations. American Nuclear Society of Austria, Vienna.
La Porte, T.R. and Rochlin, G.I. (1994) A rejoinder to Perrow. Journal of Contingencies and Crisis Management, 2(4), 221-227.
Mearns, K., Flin, R., Gordon, R. and Fleming, M. (1998) Measuring safety climate in offshore installations. Work and Stress, 12, 238-254.
OECD Nuclear Agency (1987) Chernobyl and the Safety of Nuclear Reactors in OECD Countries. Organisation for Economic Co-operation and Development, Paris.
Perrow, C. (1984) Normal Accidents. Basic Books, New York.
Pidgeon, N.F. (1991) Safety culture and risk management in organisations. Journal of Cross-Cultural Psychology, 22(1), 129-140.
Pidgeon, N.F. (1997) The limits to safety? Culture, politics, learning, and man-made disasters. Journal of Contingencies and Crisis Management, 5(1), 1-14.
Pidgeon, N.F. (1998) Safety culture: key theoretical issues. Work and Stress, 12, 202-216.
Pidgeon, N.F. and O'Leary, M. (1994) Organisational safety culture: implications for aviation practice. In N.A. Johnston, N. McDonald and R. Fuller (eds.), Aviation Psychology in Practice. Avebury Technical Press, Aldershot.
Pidgeon, N.F. and O'Leary, M. (2000) Man-made disasters: or why technology and organisations (sometimes) fail. Safety Science, in press.
Reason, J.T. (1990) Human Error. Cambridge University Press, Cambridge.
Reason, J.T. (1997) Organizational Accidents. Ashgate, Aldershot.
Reason, J.T., Parker, D. and Free, R. (1994) Bending the Rules: The Varieties, Origins and Management of Safety Violations. Rijks Universiteit, Leiden.
Schein, E.H. (1985) Organizational Culture and Leadership. Jossey-Bass, San Francisco.
Short, J.F. and Clark, L. (1993) Organizations, Uncertainties and Risk. Westview Press, Boulder, CO.
Sinclair, A. and Haines, F. (1993) Deaths in the workplace and the dynamics of response. Journal of Contingencies and Crisis Management, 1(3), 125-137.


Toft, B. and Reynolds, S. (1997) Learning From Disasters. Perpetuity Press, Leicester.
Turner, B.A. (1976) The organisational and interorganisational development of disasters. Administrative Science Quarterly, 21, 378-397.
Turner, B.A. (1978) Man-Made Disasters. Wykeham Science Press, London.
Turner, B.A. (1991) The development of a safety culture. Chemistry and Industry, 1 April 1991, 241-243.
Turner, B.A. and Pidgeon, N.F. (1997) Man-Made Disasters, 2nd edn. Butterworth Heinemann, Oxford.
Turner, B.A., Pidgeon, N.F., Blockley, D.I. and Toft, B. (1989) Safety culture: its importance in future risk management. Position paper for the Second World Bank Workshop on Safety Control and Risk Management, Karlstad, Sweden, November.
Vaughan, D. (1996) The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago University Press, Chicago.
Zohar, D. (1980) Safety climate in industrial organisations: theoretical and applied implications. Journal of Applied Psychology, 65(1), 96-102.
