


Page 1: 2008 03 03: Evacuation of Sleipner Oil & Gas Offshore Installation · 1 JCS part II: What is the difference between 120 alarms and 3 periscopes Cato A. Bjørkli Associate professor,


JCS part II: What is the difference between 120 alarms and 3 periscopes?

Cato A. Bjørkli, Associate professor, UiO

2008 03 03: Evacuation of Sleipner Oil & Gas Offshore Installation

• 228 persons were evacuated from the Sleipner facility on March 3rd due to gas exposure

• The gas leak was repaired, and the platform will return to normal production during the day.

Adresseavisen 2005.03.03: “Visibility is limited. When driving in so-called tactical mode, the driver sits down inside the vehicle with the hatch closed. Steering then happens only through 3 periscopes. Driving this way places extremely high demands on the driver’s vigilance.”

A police officer commenting on the incident where a tank, during tactical maneuvering, accidentally drove over a civilian vehicle with two passengers still inside.

And now for something completely different: JCS

JCS

Aim of this lecture

• Probe deeper into the nature of JCS
• Discuss key issues and problems
• Discuss automation
• Discuss how to cope with automation in complex systems

Background for this lecture

• Hollnagel & Woods (2006) Resilience Engineering …
• Dekker …
• Previous JCS lecture

Page 2


-- scientific method --

a way of doing something, especially a systematic way; implies an orderly logical arrangement (usually in fixed steps)

-- applied science --

ability to produce solutions in some problem domain

approximate def.

The Science of Man-Machine Interaction (MMI)

What characterizes a good product?

What characterizes a safe system?

What characterizes an efficient system?

How do we know that something is what we designed it to be?

(examples?)

-- cognitive engineering ---

Cognitive Engineering (CE) is concerned with the analysis, design, and evaluation of complex sociotechnical systems

The aim of cognitive engineering is to facilitate safe, productive and healthy work in complex sociotechnical systems

(Vicente, 1999; Rasmussen et al., 1994).

approximate def.

CE vs Design vs Usability Engineering?

“Practical difficulties arise during the investigation and reporting of most accidents. These difficulties include the determination of the scope of the phenomenon to investigate, the identification of the data required (…) These difficulties reflect differences in the purposes for the investigations, which in turn reflect different perceptions of the accident phenomenon”

Benner (1978)

Different views on accidents

• Simple Linear Accident Models
– Strings of cause and effect unfolding over time
– Domino-effect/model

• Complex Linear Models
– Interrelations between ‘unsafe’ acts
– Swiss Cheese Model

• Systemic Accident Models
– Concurrence of variability
– Errors and accidents are potentials of ‘normal’ operations

Normal vs Abnormal

• What separates functional, successful variance from the dysfunctional, dangerous variance?

• What do we call the quality of JCS to stay organised and on-track despite disturbances and surprises?

• What do we call the balancing of demands and resources?

Page 3


JCS

• Perspective How do we understand the field?

• Theories How do we assume things work?

• Concepts What are the common concepts?

• Issues What kind of problems do we solve?

• Method How do we solve problems?

JCS

• Perspective (insert your perspective here)

• Theories Human cognition and behavior (information processing?)

• Concepts Representation, automation, workload

• Issues Overload, bottlenecks, safety, interfaces

• Method Interviews, observations, experiments …

This is the Age of Informational Soup

“Information is not a scarce resource. Attention is.”
Herbert Simon, 1981

What does this imply?

A common feature of complex systems (nuclear plants, aerospace, petrochemical) is the vast amount of information available.

(big systems = a lot of information)

Further, complex systems behave in a non-linear fashion (surprises! Ref: Casey; Perrow)

The Informational Soup

“Although all of the data was physically available, it was not operationally effective” Joyce & Lapinski, 1983

(Observability is more than mere data availability - Woods et al, 2002)

Practitioners are bombarded with information at their workspaces

Imagine the view for the operator:

...control rooms, cockpits, tv production, stock markets ...

The Data Paradox

(For operators:) The more data that is available, the harder it is to find the significance and relevance of it.

(General trend:) More and more data is available, but our capacity to process it is the same. … What does this mean for operators?

Page 4


Problem formulation

“(CE) is concerned with the analysis, design, and evaluation of complex sociotechnical systems and to facilitate safe, productive and healthy work”

Case: 27.12.1991 SAS airline accident, Gottröra, Sweden. Crashed 3 min after take-off. No casualties. Aircraft written off. 120 alarms in 30 seconds.

Features of overload

• Overload is often imminent when anomalies occur, when critical incidents happen (ref: data paradox)

• Overload may also occur in aspects of normal running of the system in question

CLAIM: Clutter and confusion are failures of design, not attributes of information

Features of overload

1. Amount: There is too much information present for the operator to handle

2. Relevance: It is difficult to decide the relevance of a given data set

3. Bottleneck/time: There is too little time to respond and relate to the data available

Ref: Woods et al, 2002

1) Amount: “Too Much!”

Breaking down datasets into smaller chunks brings forth the issue of navigation and criteria of chunking

How do we navigate?

2. Relevance: ? versus !

“Given the enormous amount of stuff, and some task to be done using some of the stuff ... What is the relevant stuff for the task?” (Glymour, 1987)

What information is relevant?(Or: What constitutes relevance of information?)

How do humans know what’s relevant? (How many nuances do you see right now?)

(perceptual organisation, control of intention, nose for anomalies)

Page 5


Examples

Driver Support System: Adaptive Front Lights

Nuclear Power Plants: Cooling Fluids (DURESS)

High Speed Crafts: Navigation

2. Relevance: ? vs ! Examples

3. Automation

• When there is too much information, and attention is scarce ... let the machines do it ...

• “Automate everything you technically can” (Chapanis, 1970)

We, as man-technology specialists, know:

... the positive sides? ... the negative sides?

3. Automation Myths: Substitution

[Diagram: task demands (100 %) met by human capacity alone, versus met by a man-technology system combining automation capacity (70 %) and human capacity (30 %)]

3. Automation Myths: Qualitative Effects

“.. just add man to machine!”

[Figure: BEFORE / AFTER]

Page 6


Automation transforms practice, and humans adapt to novelties

[Diagram: Practice and Artefacts linked via Requirements and Constraints]

Design represents beliefs about how to do things, how humans think and act.

How to deal with ‘Too Much!’?

‘Too Much’ concerns the difference between functionality and availability

(What is the cure? CE?)

Tanks! revisited

Detour Question: Is there, in principle, a difference between too much information and too little?

(Is there a difference between 120 alarms and 3 periscopes?)

Info Soup with Humans

• Info Soup and the capacity of humans
– Too much! (Amount)
– Relevance? (Context-sensitivity)
– Bottlenecks (Automation)

• What are our assets as humans?
– Perceptual Capabilities (Gestalt, Ecology, particular)
– Attentional Control (Physiology, Genetic Make-Up, Attunement)
– Nose for Anomalies (Functional, Bateson, 1972, depart from ref)
– Natural Teamplayers (Social Beings, action is not alone)

• Capabilities: Organize, Prioritize, Synthesize, Adapt

Effective Solutions to Info Overload

... ‘Meaning’ is not IN data, but in the relationship between data and expertise and surroundings

1. Organisation precedes selectivity (support attention control)

2. Positive selection enhances structures (facilitate processing, don’t inhibit)

3. Context sensitivity (maintain relationships)

4. Observability, not availability (provide contextualized info)

5. Conceptual spaces (represent frame of reference)

Failure by Default

Universal Law: Failure is certain (in the end, things will always fail!)

Systems will malfunction and fall apart, despite our effort and expertise

To understand failure is similar to understanding success. (Heavily biased!)

Page 7


Nine Steps ...

1. Pursue Second Stories (wide scope, details, insignificance)
2. Avoid Hindsight (Now vs Then: the info available)
3. Work is the sharp end! (work is particular, contextual, temporal)
4. System Weaknesses (safety lies in systems, not components)
5. Look for safety in praxis (habits and skills are also barriers)
6. Underlying patterns (Situation vs General Factors)
7. Changes are multiple consequences (systems are dynamic)
8. How does technology support performance (Task / Artefact)
9. Complexity and feedback (What is the technology doing?)

Questions?