
Managing Proneness to Failure

David Blockley*

*University of Bristol, Faculty of Engineering, Queens Building, University Walk, Bristol BS8 1TR, United Kingdom.

Introduction

The overwhelming majority of accidents and disasters do not 'just happen'. Even those that seem, on the face of it, to be 'bolts from the blue' rarely are. This was the clear message of Barry Turner's first edition of his book in 1978 and it is a message that still needs to be heard today.

Of course, engineers and practical decision makers are now very skilled at anticipating and dealing with foreseen risks – but Turner was not so concerned about those. He was concerned with some of the subtle and difficult-to-identify non-causal human and organizational reasons that, he discovered, lie at the heart of major disasters.

The central issue now, as Nick Pidgeon quite rightly points out in his final chapter of this new edition of Man-Made Disasters (1997), is whether these subtle factors can be predicted. One is concerned, first, to know whether one can foresee these factors as potential threats and, then, whether one can develop the predictive capability to recognize the relationships between them that will lead to failure. If this cannot be done, is it sufficient simply to identify them as they develop and then manage them away?

Techniques to audit safety exist, but they need to be developed extensively to deal with the complex issues that Turner (1978) identified. Pidgeon (Turner and Pidgeon, 1997), in the new last chapter, is hesitant on this issue. However, this author believes that the way in which Turner approached this book and the rest of his work, using a systems approach, means that he believed that such a systematic search for the complex pre-conditions to disaster should be the end-goal of the work that he began.

The Systems Approach

There are three basic ideas that this paper seeks to stress initially, and these underpin the systems approach. They are process, emergent properties and connectivity. It is clear that Turner (1978) thought in terms of all three.

The book (1978; 1997) has many descriptions and diagrams of process. Turner described six stages in the development of a man-made disaster. He called Stage I the stage of initial beliefs and norms, rather analogous to Kuhn's (1970) period of normal science. Stage II he saw as the incubation period, in which he identified several common features across a number of accidents. In fact, he saw this as rich ground for safety auditing and management, which is exactly where the systematic search for the subtle and complex reasons for failure should occur. Stage III contains the precipitating or trigger event, which is so often confused with the 'cause' of the disaster. Stages IV and V he described as onset and as rescue and salvage, followed by Stage VI, where there is cultural readjustment.
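This staged description is, in effect, an explicit process model, and it can be written down as one. The sketch below is purely illustrative and is not from the book: it encodes the stage sequence as an ordered enumeration (the stage names are this author's shorthand) so that progress through the sequence can be followed mechanically.

```python
from __future__ import annotations

from enum import IntEnum


class DisasterStage(IntEnum):
    # Stage names are shorthand for Turner's sequence, not his wording.
    INITIAL_BELIEFS = 1        # Stage I: notionally normal beliefs and norms
    INCUBATION = 2             # Stage II: discrepant events accumulate unnoticed
    PRECIPITATING_EVENT = 3    # Stage III: the trigger, often mistaken for the 'cause'
    ONSET = 4                  # Stage IV: the immediate consequences unfold
    RESCUE_AND_SALVAGE = 5     # Stage V: first-stage adjustment
    CULTURAL_READJUSTMENT = 6  # Stage VI: beliefs and precautions are revised


def next_stage(stage: DisasterStage) -> DisasterStage | None:
    """Return the stage that follows, or None after readjustment."""
    if stage is DisasterStage.CULTURAL_READJUSTMENT:
        return None
    return DisasterStage(stage + 1)


# Walk the whole sequence from normal operation to readjustment.
stage = DisasterStage.INITIAL_BELIEFS
while stage is not None:
    print(f"Stage {stage.value}: {stage.name}")
    stage = next_stage(stage)
```

Nothing in such a skeleton captures the content of the incubation period, of course; it only makes explicit that a process, and not an isolated event, is being claimed.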

The second systems idea is that all concepts are both a whole and a part. Koestler (1975) called such concepts 'holons'. An emergent property is one that pertains to the whole and not to the parts – it emerges from the interactions of the parts and it is in this sense that the whole is more than the sum of its parts.

Prior to the book, disaster studies were usually concerned only with the awareness and activities of victims or potential victims in the period before disaster. A major contribution that Turner (1978) made was to use grounded theory to identify emergent properties, the major one of which was the Stage II 'incubation period'. This was a new idea, which Turner identified as emerging from the pre-conditions to disaster, accident or failure. Other authors, such as Pugsley (1969), had identified this idea too, not in such detail and not in sociological terms, but as a 'proneness to failure' which is an extra to the technical reliability and risk calculations that engineers know and love.
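To make the contrast with Pugsley's notion concrete, the following sketch shows the kind of calculation that is meant by 'technical reliability and risk': a textbook Monte Carlo estimate of the probability that a random load exceeds a random resistance. The distributions and parameters are invented for the illustration; the point is that the incubation factors Turner identified lie entirely outside such a model.

```python
import random

random.seed(1)  # reproducible illustration

# Illustrative limit state: failure occurs when load S exceeds resistance R.
# The normal distributions and their parameters are assumptions made up for
# this example, not values for any real structure.
N = 100_000
failures = sum(
    1
    for _ in range(N)
    if random.gauss(mu=50.0, sigma=10.0) > random.gauss(mu=80.0, sigma=8.0)
)
print(f"Estimated probability of failure: {failures / N:.4f}")
```

Under these assumed figures the estimate comes out near 0.01; nothing in it speaks to rigid perceptions, buried warnings or organizational exclusivity.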

The third systems idea is that of connectivity. Complexity can result from many simple interacting processes which are highly connected. Loops of influence can sometimes lead to unexpected and counter-intuitive results. Turner saw the importance of social, rather than individual psychological, factors and the need to see the relationship between individual and organizational decision making. Connectivity is at the heart of the incubation process. As summarized by Pidgeon in Chapter 11, Perrow (1984) used this idea, with the concepts of tight and loose coupling and complexity, to characterize a system before failure.
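As an illustration of the connectivity point, if the influences among the parts of a socio-technical system are written down as a directed graph, the loops of influence can be found mechanically. The network below is invented for the example; eliciting the real influences is, of course, the hard part of any safety audit.

```python
# Hypothetical influence network: an edge A -> B reads "A influences B".
influences = {
    "production pressure": ["maintenance backlog"],
    "maintenance backlog": ["minor defects"],
    "minor defects": ["workarounds"],
    "workarounds": ["production pressure", "normalized deviance"],
    "normalized deviance": [],
}


def find_loops(graph: dict[str, list[str]]) -> list[list[str]]:
    """Depth-first search for loops of influence. Each distinct loop
    is reported once, closed back on its starting node."""
    loops, seen = [], set()

    def visit(node: str, path: list[str]) -> None:
        if node in path:
            cycle = path[path.index(node):]
            i = cycle.index(min(cycle))      # canonical rotation
            key = tuple(cycle[i:] + cycle[:i])
            if key not in seen:
                seen.add(key)
                loops.append(list(key) + [key[0]])
            return
        for successor in graph.get(node, []):
            visit(successor, path + [node])

    for start in graph:
        visit(start, [])
    return loops


for loop in find_loops(influences):
    print(" -> ".join(loop))
```

On Perrow's (1984) account, the tighter the coupling along such a loop, the less scope there is to intervene once it begins to run.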


Limits to Predictability

So, can accidents be predicted? It is known that there are limits to predictability. This is stated simply by the idea of incompleteness – how can one know what one does not know? Plato understood that; unfortunately, many modern scientists and engineers do not. Engineers and scientists have tended to use only one-value systems to examine their theoretical models. They have been intellectually concerned with precise truth for predictability. In practice, however, they are concerned with dependable evidence in order to demonstrate a responsible duty of care and fitness for purpose. This has led to a severe tension which is largely unrecognized, but which is manifest in a variety of ways (Blockley, 1980).

Clearly, if something is true and can be used to predict, then that prediction is dependable. Unfortunately, all theories break down at some limit, so that there are circumstances for each theory where the predictions are undependable. The way in which theories and theoretical models are used, and how fit they are for the purpose for which they are intended, is still not well developed, even today.

Thus it is easy for practical decision makers to reject Turner's (1978) descriptors of the incubation period as not causal, rather unspecific and, therefore, generally unsatisfactory. However, through the development of chaos theory and the beginnings of a better understanding of the nature of uncertainty, it is apparent that that would be most unwise. Practitioners are used to using common sense, particularly when scientific knowledge falls far short of providing a solution to a given problem. Most of Turner's (1978) characteristics of the incubation period are common sense when pointed out – that is the elegance, essential simplicity and power of the idea.

Engineers often think in pictures and diagrams. The systems idea of the 'rich picture' was familiar to Turner. The event-sequence diagrams are simple process models and his use of these ideas probably stemmed from his engineering inclinations. In fact, the book is liberally sprinkled with diagrams and tables to illustrate this mode of thinking. However, there is a strange tension in the book between an emphasis on understanding (the social scientist in him) and the use of the ideas to prevent accidents, which this author knows was at the heart of what he wanted to achieve (the systems engineer in him).

He knew that the disaster process had to be actively managed and that prediction had to be part of that process. It is perhaps strange, therefore, that prediction received scant attention in the book, probably because none of the attributes he identified is causal – rather, they are features of the pre-conditions to failure.

Uncertainty

The book has a pragmatic quality in that it is grounded on case histories. This is followed by theorizing and philosophizing about bounded rationality and the nature of information and process, which is then followed by further examples to ground the ideas.

There is a clear notion that it is the unintended consequences of human action that have to be managed. Pidgeon (Turner and Pidgeon, 1997) refers, in Chapter 11, to a case history which is explained in these terms. A series of failures of factory roofs of a certain type under snow loading were not the result of any lack of duty of care or negligence. They were simply the emergent result of a series of interactions between the decisions made about the design of particular cold-formed steel purlins (roof beams) and the British Standard for the specification of snow loading on factory roofs. One of the central recommendations of the paper published in the Journal of the Institution of Structural Engineers was that engineering designers need to develop a habit of searching for the unintended consequences of their decisions (Pidgeon, Blockley and Turner, 1986). More than one engineer has since received that notion with incredulity, arguing that if something is unintended then by definition it is unimagined. However, it is central to a modern strategy of preventing accidents and disasters, as Turner (1978) foresaw, that a regular scan and 'brainstorming' for these unintended consequences is important and, in some instances, crucial.

Unintended consequences produced within organizational settings propagate non-randomly through the rules of the organization. The unintended errors at the top of an organization propagate further and have more consequences than those lower down. Failure of foresight can develop from the interactions of a number of groups and individuals. Variable disjunction of information exists when, in a complex situation, a number of parties are unable to obtain the same information, so that many different stories or interpretations exist. The idea of bounded rationality is important, therefore, in exploring the nature of unintended consequences and in searching for the origins of disasters.
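The claim that errors at the top propagate further can be given a toy quantitative form. The sketch below assumes an invented, uniform reporting tree and simply counts the positions reached by an erroneous rule issued at each level; it asserts nothing about any real organization.

```python
def positions_reached(level: int, depth: int, span: int) -> int:
    """Positions affected by an error made at `level` (0 = the top) in a
    uniform tree `depth` levels deep with `span` reports per manager,
    assuming the error propagates to the whole subtree below it."""
    remaining = depth - level
    # Geometric series: the erring position plus all of its subordinates.
    return sum(span ** i for i in range(remaining + 1))


DEPTH, SPAN = 4, 5  # invented figures: five levels, five reports each
for level in range(DEPTH + 1):
    reached = positions_reached(level, DEPTH, SPAN)
    print(f"error at level {level}: reaches {reached} positions")
```

With these invented figures, an error at the top touches 781 positions against one at the bottom – the non-random propagation that the text describes.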

Turner appreciated the limits to engineering knowledge and that engineers have to deal with incomplete, fuzzy and random data. However, he seemed to give scant attention to the technical side of the socio-technical interface. He agreed with Polanyi (1958), who noted that people seem to know more than they can articulate – they have tacit knowledge. His use of grounded theory was based on the idea that people find it easier to talk through examples than to articulate their ideas directly.


Rational plans for action do not provide perfect guides. They may be based on inadequate fore-knowledge of the pressures likely to affect the proposed actions or of the inter-relationships between these pressures. The individuals or groups concerned may be unable to collect and process the information required. There is, ultimately, no way in which unintended consequences can be avoided with certainty; the potential for disaster is always with us.

Thus another central message of Man-Made Disasters is that safety, risk and disaster management require inter-disciplinary thinking and inter-disciplinary skills, so that many points of view are taken into account. That implies a willingness to consider another's point of view when, at first sight, one may think it, at best, irrelevant and, at worst, downright misleading.

The central messages are:

● that a way of seeing is also a way of not seeing – practical decision makers can be prone to rigidity of perception and belief;

● that organizational exclusivity is dangerous – organizations that are 'high-handed' with warnings from those outside and feel that they, as experts, know better are heading for trouble;

● that difficulties with information need to be recognized – if information is not examined creatively, then difficulties may be buried in a mass of information; information may be presented only at the moment of crisis; and information may be received in a passive way;

● that care needs to be taken with 'strangers' – that is, people who are difficult to brief about a hazard, perhaps because they come from off-site;

● that failure to comply with regulations already in existence is dangerous; and

● that there is a tendency to minimize emergent danger – which is why engineers try to measure such things where possible in order to recognize that development.

The individual decision maker can attempt to behave rationally by searching for courses of action leading to 'satisfactory' outcomes. There is no way in which the individual decision maker can guarantee a 'best possible' outcome, unless he or she is acting within the limits of a small, closed system about which perfect knowledge can be gained. Even then, since all closed systems have no scope for coping with unintended consequences, any real decisions have to deal with open systems.

Energy and Information

Accidents are a release of energy in an unintended and uncontrolled way. Again, Turner demonstrates an engineering world-view. He is searching for an understanding and doubting whether disasters can be prevented. However, if only one accident is prevented, then the theorizing is justified, although, of course, one may never know whether that is so.

Turner (1978) understood the point that concentrations of energy, organizational power and populations are themselves pre-conditions to disaster. He proposed, rather unusually, that disasters are energy plus misinformation. It has taken this author some time to realize the full incisiveness of that simple statement. After a disaster, the world is transformed in unanticipated ways – but not only is the world transformed, our understanding of it has changed too. Turner saw disasters as misplaced energy, just as noise is misplaced music and misplaced flowers are weeds. One therefore needs to know for whom these things are misplaced – and there may be disagreement about that.

There is, strangely, no discussion in the book of the importance of measurement, despite references to probability. There is no explicit mention of the interaction between his concerns and those of engineering science models, where measurement is so important. Given his concern that disasters occur in a socio-technical context, he does not discuss the importance of the difference between social scientific, scientific and tested technical information. The inclusion of a lengthy discussion of Information Theory should have prompted a comparison with thermodynamic and other engineering scientific modelling. After discussing it, Turner (1978) saw how restricted Information Theory was.[1] This need for measurement lies, of course, at the heart of the development of any predictive capability for connecting the complex pre-conditions that Turner (1978) identified as, in some sense, 'causing' failure – whether the measurements, and hence the predictive models, are deterministic or statistical.
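For readers unfamiliar with the Shannon view referred to in Note 1, the following sketch computes the standard entropy of a discrete message source. The measure is deliberately blind to what messages mean, which is a plausible reading of why Turner found it so restricted for socio-technical purposes.

```python
import math


def shannon_entropy(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p)). It measures
    statistical surprise only; the meaning, relevance or danger of a
    message lies outside the formalism."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)


# A warning that is almost always absent carries little average
# information, however much it matters on the day it appears.
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: maximal uncertainty
print(shannon_entropy([0.99, 0.01]))  # about 0.08 bits
```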

There is, in the book, a clear view of the systems idea that information could be arranged into levels in a hierarchy – although it seems it was not used explicitly. He understood the distinction between open- and closed-world models and had a clear appreciation of the importance of surprise when interpreting data.

Unexpected events were classified into anomalies, serendipity and catastrophes. The occurrence of these three does not reduce uncertainty or, at least, not immediately – rather, they are interesting, favourable or unfavourable events.


An important notion is the way in which error is bound up in organizational culture (procedures and habits). An interesting example of this is whether the peer group would have done the same thing as the decision maker, so that there is no sense of negligence – but, still, an error occurs. This is crucial for an understanding of what technology can offer but, more importantly, what it cannot offer – that things can go wrong even when no one is negligent. Turner appreciated that then; many people do not, even now.

He also appreciated that good disaster management needs good general management. He emphasized the need for agreed values and direction, and an appreciation of each other's 'world-view', with plenty of discussion and communication. He saw that the need to harmonize the necessary diversity of individuals that make up an organization is crucial. The essential tension between the need for commonality of thought for organizational success and for a diversity of views for richness of development and coping with change is the essential problem of modern universities, for example. Managing this tension is the essence of preventing surprises and, hence, accidents.

Turner (1978) pointed out the stage of organizational learning after a disaster and referred to Ronan Point as an example concerning system building. This was so but, in fact, the impact on the regulations for wind loading was even more extensive. This example was an apposite preparation for the studies of the construction industry that followed, as referred to by Pidgeon in Chapter 11 (Turner and Pidgeon, 1997).

In summary, there is an essential creative tension at the heart of the book. It is, in this author's view, the tension within a careful social scientist who is, perhaps, too cautious to admit to the engineering systems approach that is within him.

Pidgeon (Turner and Pidgeon, 1997) asks if the theory of disasters can make predictions and prevent accidents. To this author, it is self-evident that even if the predictive power is limited, some disasters and accidents can be averted using the ideas set out in the book. To Turner (1978), it was difficult to prove and, hence, problematic. The essential difference between us is that he wanted proof, whereas this author merely wants to see continuous improvement over current practice.

Note

1. See page 118, line 21, of the book and the discussion in Chapters 8 and 10; also private discussions with him and the fact that he did not subsequently work on developing the ideas. Note that we are talking only about the Shannon view of information theory, which is the basis of much of the theory behind information and communications technology.

References

Blockley, D.I. (1980), The Nature of Structural Design and Safety, Ellis Horwood, Chichester.

Koestler, A. (1975), The Ghost in the Machine, Hutchinson, London.

Kuhn, T.S. (1970), The Structure of Scientific Revolutions (2nd edition), University of Chicago Press, Chicago.

Perrow, C. (1984), Normal Accidents: Living with High-Risk Technologies, Basic Books, New York.

Pidgeon, N.F., Blockley, D.I. and Turner, B.A. (1986), 'Lessons from a Roof Collapse', Journal of the Institution of Structural Engineers, Volume 64A, Number 3, March, pp. 67–71.

Polanyi, M. (1958), Personal Knowledge: Towards a Post-Critical Philosophy, Routledge and Kegan Paul, London.

Pugsley, A.G. (1969), 'The Engineering Climatology of Structural Accidents', International Conference on Structural Safety and Reliability, Washington, pp. 335–340.

Turner, B.A. (1978), Man-Made Disasters, Wykeham Publications, London.

Turner, B.A. and Pidgeon, N.F. (1997), Man-Made Disasters (2nd edition), Butterworth-Heinemann, Oxford.
