
ANRV381-SO35-17 ARI 5 June 2009 9:28

Taming Prometheus: Talk About Safety and Culture
Susan S. Silbey
School of Humanities, Arts and Social Sciences, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139; email: [email protected]

Annu. Rev. Sociol. 2009. 35:341–69

First published online as a Review in Advance on April 16, 2009

The Annual Review of Sociology is online at soc.annualreviews.org

This article’s doi: 10.1146/annurev.soc.34.040507.134707

Copyright © 2009 by Annual Reviews. All rights reserved

0360-0572/09/0811-0341$20.00

Key Words
technology, accidents, disasters, systems, management

Abstract
Talk of safety culture has emerged as a common trope in contemporary scholarship and popular media as an explanation for accidents and as a recipe for improvement in complex sociotechnical systems. Three conceptions of culture appear in talk about safety: culture as causal attitude, culture as engineered organization, and culture as emergent and indeterminate. If we understand culture as sociologists and anthropologists theorize it, as an indissoluble dialectic of system and practice, as both the product and context of social action, the first two perspectives deploying standard causal logics fail to provide persuasive accounts. Displaying affinities with individualist and reductionist epistemologies, safety culture is frequently operationalized in terms of the attitudes and behaviors of individual actors, often the lowest-level actors, with the least authority, in the organizational hierarchy. Sociological critiques claim that culture is emergent and indeterminate and cannot be instrumentalized to prevent technological accidents. Research should explore the features of complex systems that have been elided in the talk of safety culture: normative heterogeneity and conflict, inequalities in power and authority, and competing sets of legitimate interests within organizations.


“. . . the darkest and most treacherous of all the countries . . . lie in the tropic between intentions and actions. . . .”

—Chabon (2008, p. 29)

Rescuing Prometheus, by the venerable historian of technology Thomas Hughes (1998), describes how four large post–World War II projects revolutionized the aerospace, computing, and communication industries by transforming bureaucratic organizations into postmodern technological systems. In place of centralized hierarchies of tightly coupled homogeneous units typical of traditional corporate and military organizations, Hughes describes the invention of loosely coupled networks of heterogeneously distributed and often collegially connected communities of diverse participants. By the 1980s and 1990s, new modes of management and design—public participation coupled with commitments to environmental repair and protection—overcame what had been intensifying resistance to large-scale, often government-sponsored, technologies. “Prometheus the creator,” Hughes (1998, p. 14) writes, “once restrained by defense projects sharply focused upon technical and economic problems, is now free to embrace the messy environmental, political, and social complexity of the postindustrial world.”

If the engineering accomplishments of the past 40 years signify a resuscitated capacity to mobilize natural and human resources to produce, distribute, and accumulate on historically unprecedented scales, proliferating interest in safety culture may signal renewed efforts to tame Prometheus. In the past 20 years, a new way of talking about the consequences of complex organizations and sociotechnical systems¹ has developed. Although culture is a common sociological subject, those talking about safety culture often invoke the iconic concept with little of the theoretical edifice sociologists and anthropologists have built for cultural analysis. Decades after the social sciences reconceptualized culture as “the medium of lived experience” (Jacobs & Hanrahan 2005, p. 1), a normatively plural system of symbols and meanings that both enables and constrains social practice and action (Sewell 2005, pp. 152–75; Silbey 2001; 2005a, p. 343), the cultural turn has taken root in the military and engineering professions, and for similar reasons: human action and culture getting in the way of technological efficiency. However, unlike the military’s embrace of culture, where critique confronts its every move (Gusterson 2007), efforts to propagate safety culture in complex technological systems proceed with scant attention to its ideological implications. Despite the appropriation of the term culture, many advocates and scholars of technological innovation and management deploy distinctly instrumental and reductionist epistemologies antithetical to cultural analysis. We can be protected from the consequences of our very effective instrumental rationalist logics and safety can be achieved, they seem to suggest, by attending to what advocates of safety culture treat as an ephemeral yet manageable residue of human intercourse—something akin to noise in the system. How are we to understand this unexpected and unusual appropriation of the central term of the soft sciences by the experts of the hard, engineering sciences?

¹“The notion of a sociotechnical system stresses the close interdependence of both the technological artifacts and behavioral resources (individual, group, and organizational) necessary for the operation of any large-scale technology” (Pidgeon 1991, p. 131).

This article reviews popular talk and scholarship about safety culture. Since the 1990s, identifying broken or otherwise damaged safety culture has become a familiar explanation for organizational and technological failures. Although the term safety culture has been deployed across institutional sites and scholarly fields, it is largely absent from sociological scholarship. Sociologists studying accidents and disasters provide a more critical and skeptical view of safety culture, if they address it at all (e.g., Beamish 2002; Clarke 1989, 1999, 2006; Gieryn & Figert 1990; Hilgartner 1992; Perin 2005; Perrow 1999 [1984], 2007; Vaughan 1996). However, in engineering and management scholarship, the term safety culture is invoked with increasing frequency and seems to refer to a commonly shared, stable set of practices in which all members of an organization learn from errors to minimize risk and maximize safety in the performance of organizational tasks and the achievement of production goals.

In this review, I argue that the endorsement of safety culture can be usefully understood as a way of encouraging and allocating responsibility (Shamir 2008)—one response to the dangers of technological systems. Invoking culture as both the explanation and remedy for technological disasters obscures the different interests and power relations enacted in complex organizations. Although it need not, talk about culture often focuses attention primarily on the low-level workers who become responsible, in the last instance, for organizational consequences, including safety. Rather than forgoing particularly dangerous technologies or doing less in order to reduce vulnerabilities to natural, industrial, or terrorist catastrophes, talk about safety culture reinforces investments in complex, hard-to-control systems as necessary and manageable, as well as highly profitable (for a few), although unavoidably and unfortunately dangerous (for many) (Perrow 2007). At the same time, talk of safety culture suggests that the risks associated with increased efficiency and profitability can be responsibly managed and contained. The literature on safety culture traces its provenance to the copious work on risk assessment and systems analysis, system dynamics, and systems engineering that became so prevalent over the past 30 years.² At the outset, paying attention to culture seems an important and valuable modification to what can be overly abstract and asocial theories of work and organization. Despite this important correction, research on safety culture usually ignores the historical-political context, the structural relationships, and the interdependencies that are essential to cultural and organizational performances and analyses.

²Risk and systems analysis pervades contemporary organizations from manufacturing, transportation, and communications to finance, health, and education.

This review first provides a historical framing for talk about safety culture because that perspective is most clearly missing in much of the research. I suggest that talk about safety culture emerges alongside market discourse that successfully challenged the previous centuries’ mechanisms for distributing and mitigating technological risks. In the second section, I describe the more than fourfold increase in references to safety culture that appeared in popular and academic literature between 2000 and 2007. Organizing the work in terms of three commonly deployed conceptions, I then describe culture as causal attitude, as engineered organization, and as emergent. Relying on a conception of culture as an indissoluble dialectic of system and practice, both a product and context of social action, I argue that the first two perspectives not only fail to provide persuasive accounts, but reproduce individualist and reductionist epistemologies that are unable to reliably explain social or system performance. Although invocation of safety culture seems to recognize and acknowledge systemic processes and effects, it is often conceptualized to be measurable and malleable in terms of the attitudes and behaviors of individual actors, often the lowest-level actors, with least authority, in the organizational hierarchy. The third category, culture as emergent and indeterminate, critiques claims that safety culture can be confidently instrumentalized to prevent catastrophic outcomes from complex technologies. This section suggests that future research on safety in complex systems should explore just those features of complex systems that are elided in the talk of safety culture: normative heterogeneity and cultural conflict, competing sets of interests within organizations, and inequalities in power and authority. Rather than imagine complex yet homogeneous local cultures, research should explore how struggles among competing interests are part of the processes of cultural production and how normative heterogeneity, structured competition, and countervailing centers of power can contribute to, rather than undermine, safer technologies.

HISTORICAL SHIFTS: CONSTRUCTING AND DECONSTRUCTING SAFETY NETS

Why has attention to safety culture arisen at this historical moment?

Any answer must begin by acknowledging the technological catastrophes of the past 40 years: Three Mile Island, Bhopal, Chernobyl, the Challenger and Columbia accidents at NASA, the Exxon Valdez oil spill, oil rig accidents, Buffalo Creek, contaminated blood transfusions, and a host of less spectacular disasters (Ballard 1988; Davidson 1990; Erikson 1978; Fortun 2001; Jasanoff 1994; Keeble 1991; Kurzman 1987; Medvedev 1992; Petryna 2002; Rees 1994; Setbon 1993; Stephens 1980; Stern 1976/2008; Vaughan 1996, 2003, 2006; Walker 2004).

In each instance, the accident was usually explained as just that, an accident—not a system or design failure, but the result of some extraneous mistake or mismanagement of a basically well-conceived technology. Because the systems in which the accidents occurred are omnipresent, the recurring accidents undermine confidence that catastrophes can be avoided. Alongside concerns about genetically modified foods, the toxicity of commonly used household products, the migration of various synthetic compounds from plants through animals into the human body, the rapid spread of disease and contamination through porous and swift global transportation routes, and human-produced environmental degradation, technological accidents feed a deepening mistrust of science (Jasanoff 2005). If, as Hughes (1998) suggests, the invention of postmodern systems rescued Prometheus from the technological disillusionment of the 1960s and 1970s, perhaps the promotion of safety culture responds to a renewed technological skepticism in the twenty-first century.

However, accidents alone cannot be driving the recent attention to safety culture. Technological accidents are not new phenomena, and safety has been a lively concern since the middle of the nineteenth century, if not earlier. Indeed, in some accounts, much of the regulatory apparatus of the modern state was institutionalized to protect against the injurious consequences of industrial production by setting minimally safe conditions of work, establishing private actions at law, and spreading the risks (of what could not be prevented) through fair trade practices, workmen’s compensation, and pension systems, as well as labor unions, private mutual help, and insurance. Safety was one of several objectives promoted by the system of instruments regulating relations between capital and labor (cf. Baker 2002, Ewald 2002, Friedman 1967, Orren 1991, Welke 2001, Witt 2004).

In a sense, the invention of risk,³ and with it widespread insurance and regulation of workplaces, products, and markets, created the basis of a new social contract. Responsibility was transferred from the person to the situation—the job, the firm, the union, or the collective nation—forgoing reliance on any individual’s behavior, whether worker or boss. Eschewing interest in specific causality, and thus individual liability, this collectivized regime acknowledged a general source of insecurity in technology and responded with a set of generalized responses, albeit after extended and sometimes tragic struggle. Where responsibility had previously rested on the idea of proximate cause and a selective distribution of costs based on liability as a consequence of imprudence, the late nineteenth and early twentieth century industrial and business regulation redistributed costs to collectivities, offering compensation and reparation, if not safety and security. Responsibility was “no longer the attribute of a subject, but rather a consequence of a social fact” (Ewald 2002, p. 279). One was no longer “responsible because one is free by nature and could therefore have acted differently, but because society judges it ‘fair’” to place responsibility in a particular social location, that is, to cause a particular person or collectivity to bear the financial costs of the injury. In short, the costs of technological consequences were dispersed, “the source and foundation of responsibility . . . displaced from the individual onto society” (p. 279).

³Accounts vary as to the moment when probabilistic calculation about hazardous events became a recognized practice (see Hacking 1990, 2003).

Talk about safety culture offers a new twist, or possible reversion, in the allocation of responsibility for technological failures, a return to the nineteenth century and earlier regimes of individual responsibility, but in a context of more hazardous and global technologies. After several decades of sustained attack by advocates seeking supposedly more efficient and just allocations of goods through unregulated markets, the regime of collective responsibility has been dismantled, replaced by one of institutional flexibility. Rather than attempting to mitigate and distribute risk, contemporary policies and practices embrace risk (Baker & Simon 2002, p. 1). Embracing risk means to “conceive and address social problems in terms of risk”—calculated probability of hazard (Heimer 1988, Simon 1988). Human life, including the prospects of human autonomy and agency, is now conceived in very much the same way and analyzed with the same tools we employ to understand and manipulate physical matter: ordered in all its important aspects by instrumental and probabilistic calculation and mechanical regulation (Bittner 1983).

Unfortunately, risk analysis and discourse narrow consideration of legitimate alternatives while nonetheless sustaining the appearance of broad pluralism (cf. Habermas 1975). Because of the assumption that realism resides exclusively in science, reflexive observation and critique as well as unmeasured variables are excluded from official risk discourses. As a consequence, allegedly empirical analyses become solipsistic, focusing exclusively on the methods and epistemologies that are internal to technological instrumentalism (Deutch & Lester 2004; Lash & Wynne 1992, p. 4). Heimer (1985) identified the illusory nature of this supposed realism in her prescient analysis of the reactive nature of risk, demonstrating how risk (probabilities of threats to safety and security) would necessarily elude our grasp because each effort to control risk transformed its probabilities in an ever-escalating spiral.

Embracing risk also refers to the specific policies and techniques instituted over the past several decades to undo the system of collective security. “Across a wide range of institutions, officials are now as concerned about the perverse effects of . . . risk shifting [i.e., risk sharing], as they are about the risks [probabilities of hazard] being shifted” (Baker & Simon 2002, p. 4). In place of the regime of risk containment, proponents of flexibility argue that safety and security can be achieved more effectively by embracing and privatizing risk.

Although pro-privatization market policies that attempt to “make people more individually accountable for risk” (Baker & Simon 2002, p. 1) are often justified as natural and efficient, there is nothing natural about them (Klein 2007, Mackenzie 2006). Just as risk-spreading was achieved through the efforts of financial and moral entrepreneurs to transform common, often religious, conceptions of morality, responsibility, and money (Becker 1963; Zelizer 1979, 1997), contemporary risk-embracing policies are also the outcome of ideological struggles. If in the nineteenth century marketing life insurance required a modification in what it meant to protect one’s family by providing materially for them after death rather than seeming to earn a profit from death, so too risk-embracing policies in the twentieth and twenty-first centuries require a similar redefinition in what it means to be responsible, productive citizens. Contemporary moral entrepreneurs energetically promote risk taking rather than risk sharing as morally desirable; the individual more effectively provides for family security, it is claimed, by participating in a competitive, expanding, market economy than by relying on government-constructed safety nets.


This moral entrepreneurship directs our attention to safety culture because the concept arises as a means of managing technological risk, just as the previous security regime has been successfully dismantled. This is not to say that the nineteenth to twentieth century regulatory system was perfect, nor as good as it might have been, nor that it prevented or repaired all or most technological damage. It was, however, a means of distributing, if not preventing, the costs of injuries. Yet, for most of the twentieth century, risk analysts themselves expended a good part of their energy attacking this system, legitimating the risks undertaken, reassuring the public that they were nonetheless being protected, and second-guessing the regulatory “agencies’ attempts to do a very difficult job” (Perrow 1999 [1984], p. 307). Paradoxically, many risk analysts regularly assessed the risks of regulation more negatively than the risks of the hazards themselves (e.g., Deutch & Lester 2004).

With a commitment to the idea of efficient markets, critics of regulation produced accounts of government regulation as publicly sanctioned coercion sought by private firms to consolidate market power, inhibit price competition, and limit entry. As a result, critics argued, the system produced inefficiencies, a lack of price competition, higher costs, and overcapitalization (Joskow & Noll 1977, Joskow & Rose 1989; cf. Schneiberg & Bartley 2008). Interestingly, these challenges to government regulation rarely valued as highly the consumer service, product quality, and environmental protection that were also promoted by regulation. The accounts of corporate capture undermining regulatory effectiveness (Bernstein 1955; Derthick & Quirk 1985; Peltzman et al. 1989; Vogel 1981, 1986) also ignored the new social regulation in safety, consumer protection, and civil rights. Perhaps the focus on market control, and a latent hostility to the struggles between labor and capital and between manufacturers and consumers that became ideologically entwined with the struggles against regulation, blinded scholars to non-economic variables such as safety that had also been part of the regulatory regime. For whatever reasons, ideological or coincidental, the focus on market competition as the central guarantor of productivity and efficiency overlooked constituent structural features of the regime of government regulation, insurance, and liability that mitigated risk by promoting countervailing interests in safety and responsibility.

Notably, the nineteenth to twentieth century solidarity regime was “not only a paradigm of compensation but also one of prevention” (Ewald 2002, p. 281). Bottom-line profit taking required diligent efforts not simply to estimate costs and prices but also to prevent losses, that is, accidents and disasters. A host of institutional practices and organizations promoted responsibility by enacting prevention, in this way reducing costs and increasing profit. For example,

the great life insurance companies were pioneers in epidemiology and public health. The fire insurance industry formed Underwriter’s Laboratories, which tests and certifies the safety of household appliances and other electrical equipment. Insurance companies seeking to cut their fire losses formed the first fire departments. More recently, health insurance companies have been behind many efforts to compare, test, and measure the effectiveness of medical procedures (Baker & Simon 2002, p. 8; cf. Knowles 2007a,b).

Under the solidarity regime, industries, individual firms, and labor unions collectively promoted forms of social control, workplace discipline, and self-governance that were expected to reduce injuries and thus costs for the various organizations (Ericson et al. 2003). Minimally, they identified the worst offenders.

Insurance companies have traditionally also taken precautions to mitigate financial losses not only through safer practices but through investment of premiums and reinsurance. The post-1929 American banking and financial industry regulations purposively segregated different financial functions and markets to prevent excessive losses in one activity from contaminating related industries and parallel silos in the financial markets. However, since the systematic deconstruction of this regulatory regime began in the 1980s, insurance firms, like many corporations, have become ever more financialized, earning profit more directly from investments in global financial markets than from selling insurance. With the invention of derivatives and similar instruments, a wider array of firms have been transformed into financial rather than productive entities. Financialization means that capital and business risks are disaggregated, recombined in heterogeneous assets that are bought and sold globally, and distributed among myriad other firms, shareholders, and markets. Losses in these assets are supposedly protected through insurance swaps. There is an indirect but substantial consequence for safety in this financialized system because there is less interest in the reliability of the specific products manufactured or services offered. Less financial risk means reduced attention to the associated practices that encourage risk prevention and enhance safety.⁴

Finally, we cannot ignore the role of civil litigation as part of the twentieth century solidarity regime and its twenty-first century demise. The expansion of rights and remedies that began slowly with the New Deal but grew rapidly post–World War II came with “a great burst of legalization.” While “regulation proliferated, extending to aspects of life previously unsupervised by the state” (Galanter 2006, p. 4), civil litigation independently generated rights. Although some commentators describe this as a litigation explosion (Friedman 1985, Kagan 2003, Lieberman 1981), it is actually a shift: from contract litigation dominating in the nineteenth century to tort litigation predominating in the twentieth century (Galanter 1983).

⁴The financial downturn that escalated to a worldwide crisis in 2008 can be attributed in part to just these practices. In the financial markets, not only was the safety of the produced material goods less salient, but the safety or security of the financial assets was of less concern because of default swaps, hedging, and insurance on bets that finally unraveled. Rather than encouraging responsibility, the layered system of disaggregation and recombination buttressed by hedges and insurance undermined critical or responsible decision making.

Although strict liability is not the generously absurd protector of irresponsibility that critics claim it to be (Burke 2004, Holtom & McCann 2004), there is no doubt that the twentieth century produced, “by any measure, a great deal more law” (Galanter 2006, p. 5). The legal profession exploded from 1 lawyer for every 627 Americans in 1960 to 1 lawyer for every 264 in 2006. Spending on law increased, as did celebration of lawyers and legal work in popular media and film (J. Silbey 2001, 2004, 2005, 2007a,b).

In canonical Newtonian fashion, the expansion of law caused an energetic backlash. The early and mid-twentieth century cries that the legal system “failed to provide justice to the weak—gave way to a responsive critique that the nation was afflicted by ‘too much law’” (Galanter 2006, p. 5, citing Galanter 1994). One alleged legal crisis followed another, from product liability to overcrowded courts to medical malpractice (Baker 2005). Calls for tort reform and informal dispute resolution as alternatives to litigation became common, the centerpiece of organized professional and political campaigns (Burke 2004, Silbey & Sarat 1988). With Ronald Reagan’s election to the U.S. presidency in 1980 and subsequent Republican presidents, nominees to the federal courts were systematically screened for their ideological conformity with a less law, less rights agenda. By September 2008, 60% of active federal judges with this agenda had been appointed, and, as a consequence, the federal courts have joined the movement to embrace risk, becoming another voice promoting individual, rather than shared, assumption of risk (Scherer 2005).

Thus, from the middle nineteenth through the late twentieth centuries, industrial and insurance firms, individual families, the civil litigation system, governmental regulatory agencies, and labor unions built and sustained a safety net of collective responsibility; they reinforced each other within a tapestry of organizations and institutions whose interests competed, yet coalesced to support relatively safer practices. The demise of those structural components is precisely what underwrites the

www.annualreviews.org • Safety and Culture 347


ANRV381-SO35-17 ARI 5 June 2009 9:28

contemporary focus on safety culture as a means of managing technological hazards. If we do not have empowered regulatory agencies, judicial support for tort litigation, organized labor, and insurance companies with a financial interest in the safety and longevity of their customers, we have lost a good part of what made the previous paradigm work to the extent it did for as long as it did.

Talk of safety culture flourishes at the very moment when advocates extend the logic of individual choice, self-governance, and rational action from the market to all social domains. Just as historic liberalism was "concerned with setting limits on the exercise of political or public authority, viewing unwarranted interventions in the market as harmful," contemporary neoliberalism5 promotes markets "as a principle not only for limiting government but also for rationalizing authority and social relations in general" (Shamir 2006, p. 1). Through a process of so-called responsibilization, "predisposing social actors to assume responsibility for their actions" (Shamir 2008, p. 10), these policies simultaneously empower individuals to discipline themselves while distributing, as in the nineteenth century prudential regime, to each the costs of that discipline and the consequences for the lack thereof (Rose 1989, 1999). As a concept, responsibilization names efforts to both cultivate and trust the moral agency of rational actors as the foundation of individual and collective well-being (Shamir 2008, p. 11).

Because the propagation and inculcation of safety culture is only one approach to enhancing the reliability and safety of complex technologies, it is not unreasonable to wonder whether safety culture, focused on individual participants' self-determined contributions to the system as a whole, might not be described as an

5The term neoliberalism is conventionally used to refer to the policies advocating deregulation, privatization, and reliance on markets for both distribution and coordination, but also includes a set of fiscal, tax, and trade liberalization policies that is sometimes referred to as the Washington Consensus because of support by the International Monetary Fund and the World Bank.

expression of responsibilization, this neoliberal technique of governance. Without necessarily intending to promote policies of deregulation and privatization, the celebration of safety culture as a means of managing the hazardous consequences of complex systems expresses what Weber described as an elective affinity, phenomena that do not necessarily cause one another but nonetheless vary together. In the next section, I explore calls for and accounts of safety culture to extract from this diverse literature the purported meanings and relationships of safety and responsibility.

TALK ABOUT SAFETY CULTURE

Between 2000 and 2007, academic literature and popular media exploded with references to safety culture. Over 2250 articles in newspapers, magazines, scholarly journals, and law reviews in an eight-year period included references to safety culture, whereas only 570 references were found in the prior decade. Before 1980, I could find no references in popular or academic literature.6 Although the unprecedented appearance and the rapidly escalating use of the concept seem to support my hypothesis of ideological affinities between talk about safety culture and the dismantling of the regulatory state, we should look more closely at what people say to interpret what they mean when they speak about safety culture.
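The scale of that shift can be made concrete by annualizing the search counts reported here; a minimal sketch of the arithmetic, using only the figures given in the text (the period boundaries are the ones stated; the variable names are mine):

```python
# Annualized rates implied by the reference counts given in the text:
# roughly 2250 references in 2000-2007 (8 years) vs. 570 in the prior decade.
refs_recent, years_recent = 2250, 8
refs_prior, years_prior = 570, 10

rate_recent = refs_recent / years_recent  # ~281 references per year
rate_prior = refs_prior / years_prior     # 57 references per year

print(f"{rate_recent:.0f}/yr vs {rate_prior:.0f}/yr: "
      f"roughly a {rate_recent / rate_prior:.1f}-fold increase in annual rate")
```

In per-year terms, then, the raw counts understate the change: the annual rate of references roughly quintupled across the two periods.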

The earliest uses of safety culture in newspapers and popular media invoke the term primarily in discussions of nuclear power, energy generation, and weapons production to describe within organizations an "ingrained philosophy that safety comes first" (Diamond 1986). One non-nuclear reference to a British railroad accident is illustrative because, even in this less common venue, a deteriorating safety culture was offered as the explanation for what went

6I searched LexisNexis, JSTOR, and the Engineering Village databases for the years between 1945 and 2008, using the phrases safety culture, safety (and) culture, and culture of safety within two words of each other.


wrong and should be improved to prevent future accidents.7 Mechanical error compounded by lax management processes was named as the cause of the accident. Nonetheless, the judge heading the accident inquiry focused his recommendations for improving the safety culture not on the management of the system or the communications processes within the railroad hierarchy, but on the laborers, calling for "radical improvements in recruiting and training and an end to excessive overtime" (Diamond 1986).

Although talk about safety culture emerged during the 1980s when major accidents at Three Mile Island, Bhopal, and Chernobyl weakened public confidence in complex technologies, only well into the 1990s did talk about safety culture become a common phenomenon. Although thousands of newspaper articles were written about the March 28, 1979, partial meltdown of Unit 2 at the Three Mile Island nuclear power plant in Dauphin County, Pennsylvania, none spoke about the plant's safety culture. We first see accounts of lax safety culture following the December 3, 1984, explosion of a Union Carbide plant synthesizing and packaging the pesticide methyl isocyanate in Bhopal in the Indian state of Madhya Pradesh.

In these early references, the phrase is invoked primarily to denote culture in its more colloquially circulating meaning: to suggest that nations vary in their respect for safety. Because the Indian partners in the Union Carbide plant did not share the American culture (which implicitly valued safety), they were, by inference, responsible for the accident. John Holtzman, spokesman for the Chemical Manufacturers Association in Washington, DC, pointed to "the differences in 'safety culture' between the US and other countries. . . . We have a certain sense of safety. You see it in campaigns like 'buckle up.' It's not necessarily the same

7During the morning rush hours of December 12, 1988, 35 people were killed and another 100 injured when "one commuter train rammed the rear of a stopped commuter train, outside busy Clapham Junction in south London. The wreckage was then struck by a freight train" (Associated Press 1989).

elsewhere. It's difficult to enforce our culture on another country," Holtzman said, "especially when the other country seems willing to take risks in exchange for speedy technological advance" (Kiefer 1984).

This use of safety culture to name variations in national cultures, reminiscent of historic justifications for colonial rule, did not stick. Very quickly, it became apparent that the preexisting safety problems in the Bhopal plant were not peculiar to Bhopal, or to India. Although Union Carbide had insisted that the conditions in Bhopal were unique, one of its sister plants in Institute, West Virginia, produced a similar accident just eight months later (Perrow 1999 [1984], p. 358). Although an Occupational Safety and Health Administration (OSHA) inspection had previously declared the West Virginia plant in good working order, the OSHA inspection following the explosion declared that this was "an accident waiting to happen," citing hundreds of longstanding, "constant, willful, violations" (quoted in Perrow 1999 [1984], p. 359). Clearly, the different national cultures of India and the United States could not explain these accidents, which seemed to have had some other source. No one mentioned the role of lax inspections as part of the safety culture. With the exception of one story about how E.I. DuPont de Nemours & Co is "recognized within industry for its [exemplary] safety" practices (Brooks et al. 1986), the early references in popular media to safety culture do little more than invoke the term. They provide little specification of what activities, responsibilities, or symbolic representations contribute to a safety culture.

In professional and scholarly literature, the phrase safety culture first appears in a 1986 report of the International Atomic Energy Agency (IAEA) on the Chernobyl accident. Three years later, a second reference by the U.S. Nuclear Regulatory Commission (1989) states that plant management "has a duty and obligation to foster the development of a 'safety culture' at each facility and throughout the facility, that assures safe operations." After five years in common usage, an IAEA report defined


safety culture as "that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear power safety issues receive attention warranted by their significance" (IAEA 1991, p. 8; 1992). As Perin (2005) comments in her detailed study of four nuclear power plants, "Determining that significance in particular contexts is . . . the crux of the quandary" (p. 14).

For the past two decades, researchers have been actively engaged in analyses of safety culture, with the vast majority of work produced in engineering, management, and psychology, and a smattering of mostly critical work produced in sociology and political science. If we look across these fields, we find variation in the ways in which safety culture is invoked, although there is a great deal of conceptual importation from the social sciences to what we may think of as applied social science in engineering and management.

The General Concept of Culture

Culture is an actively contested concept; its importation into organizational and engineering analyses is equally contentious. Confusion derives in part from intermingling two meanings of culture: a concrete world of beliefs and practices associated with a particular group and an analytic tool of social analysis referring to a system of symbols and meanings and their associated social practices, both the product and context of social action. The analytic concept is invoked (a) to recognize signs, performances, actions, transactions, and meanings as inseparable, yet (b) "to disentangle, for the purpose of analysis [only], the semiotic influences on action from the other sorts of influences—demographic, geographical, biological, technological, economic, and so on—that they are necessarily mixed with in any concrete sequence of behavior" (Sewell 2005, p. 160). Thus, organizational culture and safety culture are terms used to emphasize that organizational and system performances are not confined to formally specified components, nor to language alone. Although formal organizational attributes and

human interactions share symbolic and cognitive resources, many cultural resources are discrete, local, and intended for specific purposes. Nonetheless, it is possible (c) to observe general patterns so that we are able to speak of a culture, or cultural system, at specified scales and levels of social organization. "System and practice are complementary concepts: each presupposes the other" (Sewell 2005, p. 164),8 although the constituent practices are neither uniform, logical, static, nor autonomous. As a collection of semiotic resources deployed in interactions (Swidler 1986), "culture is not a power, something to which social events, behaviors, institutions, or processes can be causally attributed; it is a context, something within which [events, behaviors, institutions, and processes] can be intelligibly—that is, thickly—described" (Geertz 1973, p. 14). (d) Variation and conflict concerning the meaning and use of these symbols and resources is likely and expected because at its core, culture "is an intricate system of claims about how to understand the world and act in it" (Perin 2005, p. xii; cf. Helmreich 2001).

Culture as Causal Attitude

For some authors, safety culture is understood as a measurable, instrumental source composed of individual attitudes and organizational behavior, or conversely as a measurable product of values, attitudes, competencies, and behaviors that are themselves the cause of other actions (Cox & Cox 1991, Geller 1994, Glennon 1982, Lee 1996, Ostrom et al. 1993). In both uses, culture "determine[s] the commitment to, and the style and proficiency of, an organization's health and safety programs"

8"The employment of a symbol," Sewell (2005, p. 164) writes, "can be expected to accomplish a particular goal only because symbols have more or less determinate meanings—meanings specified by their systematically structured relations to other symbols. But it is equally true that the system has no existence apart from the succession of practices that instantiate, reproduce, or—most interestingly—transform it. Hence, a system implies practice. System and practice constitute an indissoluble duality or dialectic."


(Reason 1997, p. 194, citing Booth, UK Health and Safety Commission 1993). Whether the first mover or an intermediate mechanism, "an ideal safety culture is the engine that continues to propel the system toward the goal of maximum safety health, regardless of the leadership's personality or current commercial concerns" (Reason 1997, p. 195). Culture as the ultimate, intermediate, or proximate cause often leaves unspecified the particular mechanism that shapes the safe or unsafe outcomes of the organization or technology (but see Glennon 1982, Zohar 1980), with much of the management and engineering literatures debating exactly this: how to operationalize and measure both the mechanism and the outcome. Clearly, this conception of safety culture belies exactly that thick description of practice and system that cultural analysis entails (Fischer 2006, Geertz 1973, Silbey 2005b).

A persistent muddle in this usage derives, in part, from the aggregation over time and across professional communities of concepts developed to name the emergent properties of social interactions not captured by the specification of components, stakeholders, objectives, functions, and resources of formal organizations. There seems to be a recurring cycle in which heretofore unnamed or unperceived phenomena are recognized as playing a role in organized action. A construct is created to name what appear to be stable, multidimensional, shared features of organized practices that had not yet been captured by existing categories and measures. All this is fine and congruent with the best sociology. However, once the phenomena are named, some researchers attempt to specify and measure them more concretely; disparate results generate continuing debate about different conceptualizations and measurement tools (Cooper 2000, Guldenmund 2000). As empirical results outpace the purportedly descriptive models, new constructs are offered to name the persistent, yet elusive effluent of unpredicted events, now hypothesized as intangible cultural causes, fueling additional debate. Thus, talk of safety culture emerged as a subset from prior talk about organizational culture

(Beamish 2002; Bourrier 1996; Carroll 1998a,b; Cooper 2000; Schein 1992), and both organizational and safety culture developed alongside concepts of organizational climate and safety climate, generating a bewildering mix of concepts and measures. Numerous efforts have attempted to parse these terms, with negligible theoretical advance (Denison 1996, Zhang et al. 2002).

To some extent, the conceptual puzzle is energized by occupational and professional competitions, different disciplinary communities pushing in one direction or another, using preferred concepts and tools to authorize expert advice about how to design systems, assess performance, and manage them on the basis of this information (Abbott 1988). Although organizational and safety culture can and should be normatively neutral, the terms have usually been deployed to emphasize a positive aspect of organizations, one that leads to increased safety by fostering, with minimal surveillance, an efficient and reliable workforce sensitized to safety issues. The framing generates ellipses that invite further conceptual elaboration to account for what has been excluded in the particular normative tilt of the concept. Although some authors view culture as something that can be changed—managed to improve organizational performance—and seek to develop models to generate more effective safety culture (Carroll 1998b, Cooper 2000), others adopt more disinterested formulations (Beamish 2002, O'Reilly & Chatman 1996).

Guldenmund's systematic review of the literature through 2000 describes organizational and safety culture as general frames determining organizational and safety climates.

The term organizational climate was coined to refer to a global, integrating concept underlying most organizational events and processes. Nowadays, this concept is referred to by the term organizational culture whereas organizational climate has come to mean more and more the overt manifestation of culture within an organization. Therefore, climate follows naturally from culture, or, put another way,


organizational culture expresses itself through organizational climate (Guldenmund 2000, p. 221).

Summarizing across dozens of uses, Zhang et al. (2002, p. 8) suggest that safety culture be understood as

the enduring value and priority placed on worker and public safety by everyone in every group at every level of an organization. It refers to the extent to which individuals and groups will commit to personal responsibility for safety, act to preserve, enhance and communicate safety concerns, strive to actively learn, adapt and modify (both individual and organizational) behavior based on lessons learned from mistakes, and be rewarded in a manner consistent with these values.

Although these uses of safety culture refer to "the shared values, beliefs, assumptions, and norms which may govern organizational decisionmaking . . . about safety" (Ciaverelli & Figlock 1996), much of the research observes, measures, and assesses safety culture through survey instruments collecting individual expressions, attitudes, and beliefs, or what others define as safety climate. In effect, the terms are collapsed, so that safety climate becomes "the temporal state measure of safety culture," assessed and evaluated in terms of the degree of coherence and commonality "among individual perceptions of the organization" (Zhang et al. 2002, p. 10). Some studies recognize the inadequacy of assessing a diffuse, emergent phenomenon such as culture through individual measures, and as a consequence add a group or aggregate measure to designate that which is shared (Cox & Cox 1991) or applies to the group (Lee 1996), the set (Pidgeon 1997, 1998), or the assembly. Because a good part of the literature on safety culture seeks to develop tools to improve organizational performance, small linguistic variations in conceptualization of the often intangible system of signs and practices, even if not named as such, become critically determinant variations for empirical researchers

and management toolmakers (Humphrey et al. 2007, Morgeson & Humphrey 2006). The resulting literature is littered with competing models, instruments, types of analysis, and measures, providing much occupation but unreliable instruction or guidance (Cooper 2000, Guldenmund 2000), offering more "heat than light" (O'Reilly & Chatman 1996, p. 159). Nonetheless, the repeated efforts to specify and measure safety culture in terms of individual attitudes and behaviors to "foster a reliable workforce" who will "commit to personal responsibility" (Zhang et al. 2002) illustrate well affinities between policies of responsibilization and advocacy of safety culture.
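The measurement move described in this literature — collapsing culture into climate and assessing the "degree of coherence and commonality" among individual perceptions — is commonly operationalized with a within-group agreement index in the style of James and Demaree's r_wg. A minimal sketch follows; the survey item, ratings, and the choice of population variance are my illustrative assumptions, not taken from any study cited here:

```python
from statistics import mean, pvariance

def rwg(ratings, scale_points=5):
    """Within-group agreement in the style of the James-Demaree r_wg index:
    1 minus the ratio of observed variance to the variance of a uniform
    (no-agreement) null distribution over the response scale."""
    null_variance = (scale_points ** 2 - 1) / 12  # variance of a discrete uniform
    return 1 - pvariance(ratings) / null_variance

# Invented 5-point ratings of one survey item ("safety is a priority here")
# from a single work group:
group = [4, 5, 4, 4, 5, 4]
print(f"group mean: {mean(group):.2f}  agreement r_wg: {rwg(group):.2f}")
```

The index makes the conceptual slippage visible: a high mean with high agreement licenses treating individual responses as a "shared" group property, which is exactly the inferential leap the critique in this section targets.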

The Report of the BP U.S. Refineries Independent Safety Review Panel (Baker Panel 2007, hereafter BP Report 2007) marks, perhaps, the quintessence of talk about safety culture. Following a catastrophic accident at a BP refinery in Texas City, Texas, on March 23, 2005, resulting in 15 deaths and more than 170 injured persons, as well as significant economic loss, the U.S. Chemical Safety and Hazard Investigation Board recommended, with explicit urgency, that BP initiate its own parallel investigation into its safety management practices.9

The BP Report, issued 21 months later, described what it repeatedly called a "damaged safety culture." The phrase safety culture appears 3 times on the opening page and more than 390 times in the approximately 150-page document. Clearly, safety culture has become the mantra for technologically complex and hazardous organizations.

The report claims early and often that BP has "come to appreciate the importance of cultural factors in promoting good process safety performance" (BP Report 2007, p. 59) but nonetheless adopted a rather shallow notion of safety culture that focused on individual actions rather than on systemic processes. Thus, alongside safety culture, the first of its three

9BP had experienced "two other fatal safety incidents in 2004, a major process-related hydrogen fire on July 28, 2005, and another serious incident on August 10, 2005" (BP Report 2007).


high-level findings, the report identified the need for process safety management systems and performance evaluation, corrective action, and corporate oversight. The report repeatedly states that, in contrast to the individualized notion of safety promoted by BP, safety is rather the responsibility of the corporate board, which must exercise leadership by establishing safety as a core value across all its refineries—exactly what it had failed to do. "Absent a healthy safety culture, even the best safety management systems will be largely ineffective" (BP Report 2007, p. 59).

Thus, the report specifically distinguishes between personal safety (slips and falls) and process safety, that is, safety built into the design and engineering of the facilities, management, and maintenance, exactly what BP had failed to do. In its own inquiry, BP had used interview and survey data that revealed significant variation in attitudes and perceptions about safety within and across five plant sites. On the basis of these attitudinal variations, BP declared its process safety damaged. By relying on attitudes as indicators, BP clearly identified individuals rather than the system as the central safety focus. In contrast, the panel report described systemic problems such as underfunded management of the U.S. plants and underresourced safety programs alongside an abundance of discrete safety initiatives that overloaded and underresourced management and workers. Similarly, in keeping with the responsibilization of lower-level workers, BP had cited worker fatigue as one of the root causes of the Texas City accident, despite the fact that fatigue was common across sites, including those that did not experience a major accident. The panel report argued that BP's focus on individual behaviors and errors ignored and failed to address the root causes of accidents: fatigue and sensory overload due to management policies, in this instance, specifically policies that relied on routine overtime to meet production needs rather than on hiring additional employees. By emphasizing personal safety, the report claims that BP leadership failed to establish process safety or system safety as a core value and target for investment. Because BP relied heavily on individual injury rates to assess safety performance and because these personal safety indicators showed improvement, BP was mistakenly confident that it was addressing process (design and management) risks. The report nicely highlights a feature of the culture that mistakenly defined safety as individual responsibility.
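BP's reliance on individual injury rates can be illustrated with the standard OSHA recordable-rate formula (injuries normalized to 200,000 hours worked); the year-over-year figures below are invented purely to show how a personal-safety indicator can improve while process-safety warning signs worsen:

```python
def recordable_rate(injuries, hours_worked):
    """OSHA-style recordable incident rate, normalized to 200,000 hours
    (100 full-time workers x 40 hours x 50 weeks)."""
    return injuries * 200_000 / hours_worked

# Invented year-over-year figures for a single refinery (illustrative only):
plant = {
    2003: {"injuries": 12, "hours": 2_400_000, "process_near_misses": 8},
    2004: {"injuries": 7,  "hours": 2_400_000, "process_near_misses": 19},
}
for year, d in sorted(plant.items()):
    rate = recordable_rate(d["injuries"], d["hours"])
    print(f"{year}: injury rate {rate:.2f}, "
          f"process near misses {d['process_near_misses']}")
# In this toy scenario the personal-safety indicator improves even as
# process-safety warning signs accumulate.
```

A dashboard built only on the first metric would report progress in exactly the years process risk is growing, which is the panel's complaint in miniature.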

Culture as Engineered Organization

Other scholars speak less about organizational or safety culture in general than specifically about an organization's learning culture (Carroll 1998a,b), especially in high-reliability organizations (HROs) (Eisenhardt 1993, Klein et al. 1995, La Porte & Consolini 1991, La Porte & Rochlin 1994, Roberts & Rousseau 1989, Roberts et al. 1994, Rochlin et al. 1987, Schulman 1993, Weick 1987, Weick et al. 1999). Like the previous category, however, the main focus of these authors has been to understand how culture leads to particular outcomes, specifically, reliability and efficiency. Again, culture is instrumentalized in order to manipulate and manage its consequences. This work differs from the previous category, however, by its explicit articulation of the organizational configuration and practices that should make organizations more reliably safe. Nonetheless, the HRO literature also seems to invoke a notion of culture as homogeneous and instrumentally malleable.

HRO analysts suggest that good organizational design with built-in redundancies, decentralized decision making for prompt in situ responses, and extensive training alongside trial-and-error learning can create high reliability, that is, safety, even in organizations with particularly hazardous technologies (e.g., Marone & Woodhouse 1986, Weick & Sutcliffe 2001). Continuous operations and learning that allow for backup to compensate for failures will lead, HRO theorists argue, to reduced error rates and safer outcomes. Organizational learning takes place through trial and error, supplemented by anticipatory simulations. Although HRO scholars most


often describe the prevalence of these conditions in military-style organizations, each of which holds itself to a failure-free standard of performance (e.g., nuclear submarines, nuclear power plants, aircraft carriers, space shuttles, and air traffic control), they argue that such organizational practices and what are called processes of collective mindfulness are appropriate in nonmilitary organizations as well. HROs provide, authors claim, "a unique window into organizational effectiveness under trying," dangerous (Weick et al. 1999, p. 81), and high-velocity environments (Eisenhardt 1993).
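The appeal of built-in redundancy rests on a probabilistic intuition: independent backups multiply failure probabilities down toward zero, but a shared failure mode puts a floor under system failure no matter how many copies exist. A minimal sketch under stated assumptions (the probabilities are invented, and the common-cause term is a simplified beta-factor-style model, not a claim about any system in the HRO studies):

```python
def p_fail_independent(p_component, n_copies):
    """System fails only if all redundant copies fail independently."""
    return p_component ** n_copies

def p_fail_common_cause(p_component, n_copies, p_common):
    """Beta-factor-style sketch: a shared failure mode defeats every copy
    at once, so it bounds system reliability regardless of n_copies."""
    return p_common + (1 - p_common) * p_component ** n_copies

p = 0.01  # assumed per-component failure probability
print(f"independent triple redundancy: {p_fail_independent(p, 3):.1e}")
print(f"with a 0.5% common-cause mode: {p_fail_common_cause(p, 3, 0.005):.1e}")
```

Under these toy numbers the common-cause term dominates by orders of magnitude, which is the formal version of Sagan's point later in this section: redundancy added onto a system, where copies share failure modes, delivers far less safety than the independence assumption promises.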

HRO scholars focus on what they claim is a distinctive though not unique set of cognitive processes that prevail in the better plants and systems. These five orientations (preoccupation with and proxies for failure, reluctance to simplify interpretations, sensitivity to operations, commitment to and capabilities for resilience, and resistance to over-structure or preference to under-specify the system) lead to mindfulness—a feature of safe organizations (Weick et al. 1999, pp. 83, 88). Mindfulness can be understood in terms of the quality and allocation of scarce attention, the repertoire of action capabilities, or active information searching (Westrum 1997), but is perhaps most concisely described by Vaughan (1996, chapter 4) as "interpretive work directed at weak signals." In contrast,

when fewer cognitive processes are activated less often, the resulting state is one of mindlessness characterized by reliance on past categories, acting on "autopilot," and fixation on a single perspective without awareness that things could be otherwise. . . . [T]o say that an organization is drifting toward mindlessness is simply another way of saying that the organization is drifting toward inertia without consideration that things could be different (Weick et al. 1999, p. 91).

If many culture-as-cause analyses describe safety culture shaping members' safety attitudes and behavior, HRO analyses adopt a less reductionist or determinist epistemology

(but see Klein et al. 1995). Nonetheless, slippages toward instrumental conceptions of culture and aspirations for homogeneously distributed cognitive capacities also appear in the corpus of HRO scholarship. Because the HRO is offered as a model for reliably safe performance, it becomes essential to operationalize with increasing specificity the particular mechanisms that will ensure that performance, pushing a less reductionist model of culture toward the same limitations as the culture-as-cause literature. Although many organizations share the named characteristics of high reliability, these are apparently insufficient for preventing accidents because not all such organizations are reliably safe. When organizations that ought to be highly reliable—because they exhibit the specified characteristics—are not, authors either claim poor management or alter the criteria of success. What is now called high-reliability theory (HRT) is a response that transforms a prescription into a hypothesis, in some cases by transforming the independent variables naming organizational processes, and at other times reframing the dependent variable or the definition of reliability. Early formulations emphasized the total elimination of error, absence of trial-and-error learning, a closed system buffered from environmental stresses, and a singular focus on safety (Weick 1987, Weick & Roberts 1993). Later versions basically inverted the criteria to value the role of trial-and-error learning, learning from failures, the importance of exogenous influences such as regulations and public perception, and the importance of multiple objectives alongside safety (e.g., safety and service) (LaPorte & Consolini 1991, LaPorte & Rochlin 1994).

Empirical tests of HRT have challenged its own reliability as well as validity. For example, Klein et al. (1995) compared the organizational cultures in a range of HROs to other organizations. Unlike most organizations, HROs showed few hierarchical differences in cultural norms, although there were differences across HROs. In a study restricted to two nuclear plants, Bourrier (1996) found that the “organizations use quite different strategies in their

354 Silbey

Annu. Rev. Sociol. 2009.35:341-369. Downloaded from arjournals.annualreviews.org by MASSACHUSETTS INSTITUTE OF TECHNOLOGY on 09/10/09. For personal use only.


ANRV381-SO35-17 ARI 5 June 2009 9:28

search for reliability and effectiveness,” including coordination of workers and structuring of tasks, variables specifically excluded in the basic HRO model. Although the lack of hierarchical variation emphasized the cultural homogeneity within individual organizations, research failed to demonstrate a shared culture across organizations.

In perhaps the most important empirical tests, Sagan (1993) failed to substantiate HRT’s fundamental premises. Using Freedom of Information Act petitions, Sagan scoured previously classified archives to discover why there have been no unintended explosions of nuclear weapons. Is this a uniquely safe system—a model of high reliability? Sagan’s analysis of two moments of near nuclear war (Cuban Missile Crisis, 1973 October defcon alert) and the loss of a nuclear-armed aircraft (1968 bomber crash near Thule Air Base, Greenland) reveals that the system is anything but reliably safe. Even the necessary, if not sufficient, conditions for HRO failed; there were no accidents, however, although there were repeated near misses. Sagan’s data discredit the fundamental features of the high-reliability model. He argues that redundancy, promoted by HRT to prevent accidents, is often the cause of problems, especially when redundancy is added on rather than designed into systems. The bomber crash was the result of planned redundancy that was itself the source of near disaster. Planes loaded with nuclear weapons routinely fly as a form of a “redundant triad of U.S. strategic bombers, submarine-launched ballistic missiles (SLBMs) and intercontinental ballistic missiles (ICBMs), to ensure that retaliation would be possible under any conceivable circumstances in which the U.S. might be attacked” (Sagan 1993, p. 157). When a Strategic Air Command bomber with nuclear weapons crashed on January 21, 1968, “the conventional high explosive in all four of the nuclear bombs went off. No nuclear detonation occurred but radioactive debris was dispersed over a wide expanse” (Sagan 1993, p. 156; 1996). This near miss would not have occurred without the built-in redundancy that

increased the number of nuclear weapons routinely deployed and circulating the globe.
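Sagan’s claim that add-on redundancy can raise rather than lower risk can be restated as a toy probability calculation. The model and every number below are my own illustrative assumptions, not figures from Sagan (1993): independent safeguards multiply small probabilities, but a shared (common-mode) dependency puts a floor under failure risk, and redundancy that adds its own exposure (like keeping more armed bombers routinely aloft) can cost more than it protects.

```python
# Toy probability sketch; all parameters are hypothetical illustrations,
# not estimates drawn from Sagan (1993).

def independent_redundancy(p_fail, n):
    """System fails only if all n independent safeguards fail:
    the classic argument for redundancy."""
    return p_fail ** n

def common_mode(p_fail, n, p_shared):
    """A shared dependency can defeat all n safeguards at once,
    putting a floor under the failure probability."""
    return p_shared + (1 - p_shared) * p_fail ** n

def added_on_redundancy(p_fail, n, p_exposure):
    """Each added safeguard also contributes its own accident
    exposure (e.g., another armed bomber routinely in the air)."""
    return p_fail ** n + n * p_exposure

p = 0.01
print(independent_redundancy(p, 3))       # ~1e-06: redundancy looks decisive
print(common_mode(p, 3, p_shared=0.001))  # ~1e-03: floor set by the shared mode
print(added_on_redundancy(p, 3, 0.005))   # ~1.5e-02: worse than one safeguard alone
```

On these made-up numbers, tripling the safeguards helps only if they are genuinely independent and exposure-free; both conditions are exactly what Sagan’s cases call into question.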

Sagan (1993) also identifies conventional features of organizations that impede learning, for example, the persistence and limitations of bounded rationality that pervades “garbage can” processes (Cohen et al. 1972). These adaptive, yet unscriptable decision-making practices prevail in many complex, porous organizations where unstable environments, unclear goals, misunderstanding, mis-learning, and happenstance prevail.

Sagan shows that in each of the high-alert moments of crisis, protocols for ensuring against accidental detonation were violated. Because the supposed protections were not operating, only chance prevented nuclear detonation. He demonstrates that there was no learning among analysts or high-level officers from one incident to the next. More importantly, from my perspective, and a point I return to below, Sagan’s work stresses the importance of competing group interests that undermined not only commitments to safety, but also HRT’s notions of homogeneous organizational culture and self-reflexive learning. Group rivalries led to limited communication, burying information about what actually happened and impeding development of shared interpretations that can promote improvement over time. Concerns about organizational surveillance, fear of publicity about near misses, and inter- and intraservice rivalries produced a culture of informational secrecy among the military that leads to even more near misses10 (cf. Galison 2004).

Culture as Emergent and Indeterminate

If optimism characterizes HRT, in effect suggesting that “if we only try harder we will have

10It may be worth remembering that just this kind of organizational competition and secrecy among law enforcement agencies contributed to the failure to respond effectively to intelligence information prior to the 9/11 World Trade Center disaster.


virtually accident-free systems,” more skeptical scholars believe “that no matter how hard we try we will still have accidents because of intrinsic characteristics of complex/coupled systems” (Perrow 1999 [1984], p. 369). For those who eschew reductionist and instrumental conceptions, culture is understood to be emergent and indeterminate, an indissoluble dialectic of system and practice. As such, the consequences of safety culture cannot be engineered and only probabilistically predicted with high variation from certainty. For scholars who adopt this constitutive perspective, “safety as a form of organizational expertise is . . . situated in the system of ongoing practices. . . . [S]afety-related knowledge is constituted, institutionalized, and continually redefined and renegotiated within the organizing process through the interplay between action and reflexivity.” Here, safety practices have “both explicit and tacit dimensions, [are] relational and mediated by artifacts, . . . material as well as mental and representational” (Gherardi & Nicolini 2000, p. 329). Rather than a specific organization of roles and learning processes or a measurable set of attitudes and beliefs, safety is understood as an elusive, inspirational asymptote, and more often only one of a number of competing organizational objectives.

In his original formulation of the theory of normal accidents, Perrow (1999 [1984], p. 94) identified those intrinsic characteristics of sociotechnical systems that challenge aspirations to total safety, breeding failure and catastrophe. Where system components are complexly organized (i.e., with many interacting parameters and subsystems, indirect and inferential sources of information, feedback loops, and personnel isolation), tight coupling among the subunits undermines the ability to recover from inevitable malfunctions. Tightly coupled systems have little slack and more invariant sequences in time-dependent processes, usually permitting “only one way to reach the production goal.” Thus, when things go wrong, and they always do, if only because of the variability in component life spans or unobserved faults in minor or major parts, these tightly

coupled and complex systems have limited temporal slack, substitutability, and response options. This conception of normal accidents directly challenges the high-reliability model of intense discipline, rigid socialization, and isolation. However, even in analyzing military organizations, which emphasize discipline, socialization, and relative isolation from environmental contamination, Sagan successfully challenged HRT and extended the normal accidents model by emphasizing issues of bounded rationality and interest competition within and between organizations. Finally, in a very different setting (the 1980s savings and loan crisis), Mezias (1994) also showed how tight coupling and complexity increased simultaneously to produce catastrophic results in a nonmechanical but nonetheless complex technology.11
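Perrow’s coupling argument can be caricatured in a short simulation; this is my own sketch of the reasoning, not a model from Perrow (1999 [1984]), and the failure rates are invented. The only difference between the two systems below is slack: the number of buffers, substitutes, or delays available to absorb a step failure before it becomes a system failure.

```python
import random

def run_completes(n_steps, p_step_fail, slack, rng):
    """One production run through a fixed, invariant sequence of steps.
    slack = recovery options left before a step failure halts the run."""
    remaining = slack
    for _ in range(n_steps):
        if rng.random() < p_step_fail:
            if remaining == 0:
                return False      # tightly coupled: no way to recover
            remaining -= 1        # loosely coupled: absorb the fault
    return True

rng = random.Random(0)
trials = 10_000
tight = sum(run_completes(20, 0.02, slack=0, rng=rng) for _ in range(trials)) / trials
loose = sum(run_completes(20, 0.02, slack=3, rng=rng) for _ in range(trials)) / trials
print(f"tightly coupled success rate: {tight:.3f}")  # roughly 0.98**20, about 0.67
print(f"loosely coupled success rate: {loose:.3f}")  # close to 1.0
```

The point of the contrast is Perrow’s: the component failures are identical in both runs; what differs is only the system’s capacity to recover from them.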

In a series of thickly described accounts, Vaughan (1996, 1999, 2003, 2004, 2005a,b, 2006) has provided close, carefully nuanced analyses of how the routine features of bureaucratic organizations that make for effective coordination across persons, times, and tasks nonetheless lead to mistakes, misconduct, and disaster. Bridging micro and macro perspectives, Vaughan’s work proposes a series of mechanisms that prevent well-intentioned actors and well-designed organizations from achieving desired objectives. Despite significant differences between loosely coupled networks of heterogeneously distributed, and often collegially connected, communities of diverse participants (Hughes 1998) and tightly coupled complex systems (Perrow 1999 [1984]), both display consistent cognitive patterns that undermine safety and render accidents normal. Vaughan (1999) describes these practices as the “dark side of organizations.” However, rather than focusing on hidden information, Vaughan emphasizes the interpretive flexibility in all processes; she demonstrates how the cognitive

11I write this essay as the financial crisis of 2008 is unfolding. One cannot help but notice the unfortunate parallels with previous moments when supposedly expert technologies failed to perform as their promoters and beneficiaries insisted they would.


construction of situations is at the heart of the safety problem and why organizations do not learn from their mistakes.

Unusual events and accidents are generated by the same cognitive processes that enable the ordinary, routine interactions of daily life. Vaughan documents the ways that participants interpret uncertain and anomalous events as routine, thus failing to identify emerging disasters. This interpretive construction is an irreducible feature of organizational processes because it is also a necessary feature of all social action. Cognitive processes homogenize, or normalize, across differences, so that each event is not perceived as unique but is categorized, and responded to, as an example of something known and familiar for which interpretations and established responses exist. Thus, action can proceed across time and space rather than as if each moment, phenomenon, or interaction was being experienced for the first time. (This is exactly the import of defining culture as both system and practice. Each act is interpretable only as part of a system; the system is produced through myriad individual actions.) However, routinized habits and tacit knowledge that are fundamental constituents of sociality—mechanisms for assembling the social (Latour 2005)—also efface particular differences that can, and sometimes do, have catastrophic consequences.

Within rich ethnographies of sociotechnical systems, scholars display the local enactment of more general representational practices that constitute, reconstitute, and reform the cultural system in which safety is valued, if not consistently achieved (Clarke 1989, 1999; Gusterson 1998; Perin 2005; Vaughan 1996). This research varies by the specific foci of interpretation: on artifacts and objects (Hilgartner 1992, Schein 1992) including, for example, radiation (Hacker 1987), asbestos (Maines 2005), o-rings (Gieryn & Figert 1990), oil (Beamish 2002), system conditions or signals (e.g., Perin 2005, Walker 2004), the significance of a particular event (Galison 1997, pp. 352–62; Gieryn & Figert 1990), repeated events (Sagan 1993; Vaughan 2003, 2005a), or imagined events (e.g.,

Eden 2006). In these studies, researchers describe how:

1. Linguistic schema, formal categories, embedded norms, and familiar artifacts provide both fixed and flexible frames of reference with which people apprehend and interpret information, system performances, risks, and safety (Clarke 1993, Heimer 1988, Kahneman et al. 1982, Pfohl 1978, Starbuck & Milliken 1988).

2. Information that might shape more cautious and responsive interpretations is often missing, actively buried (Sagan 1993), or discredited (Vaughan 1996, 2003). Some knowledge is removed or segmented by the distributed work processes and organizational norms of secrecy that impede the communication or understanding that is vital for our safety (Galison 2004).

3. Dangers that are neither spectacular, sudden, nor disastrous, or that do not resonate with symbolic fears, can remain ignored and unattended, and as a consequence are not interpreted or responded to as safety hazards (Alvarez & Arends 2000, Brown & Mikkelsen 1990, Glassner 1999). For example, Beamish (2002) describes oil spilling continuously for 38 years in the Guadalupe Dunes between Los Angeles and San Francisco and how agencies geared to answer dramatic and sudden pollution events lacked the frameworks to recognize or tools to respond to ongoing, routine environmental degradation by continuously leaking pipes. From the 1950s until the 1990s, the spill was ignored, existing physically but not in any organizationally cognizable form. Because it was continuous, ongoing for 20 years or more, it was in effect routine and interpreted as such. However, when the spill was no longer just a set of distributed puddles but began to appear in the ocean nearby, “impressions of what was normal quickly changed” (Beamish 2000, p. 481). Yet, because there was no category for perceiving, naming, and


responding to slow, long-term environmental degradation, the spill was again normalized through that part of the oil field subculture that expected rapidly unfolding disasters. It was reinterpreted as an emergency, an available category for those with expertise in short-lived but possibly big oil spills. Interpreted as an emergency, the experts responded with standardized responses that turned out to be inappropriate for the long-term system failure and environmental degradation that had actually taken place. Vaughan’s (2003, 2005b) analyses of the Challenger and Columbia disasters provide parallel examples of situations in which patterned, systemic conditions are repeatedly misperceived and misinterpreted and, when disaster strikes, are reinterpreted once again, in these cases oppositely, not as emergencies but as random, incidental, contingent occurrences rather than the product of long-term systemic processes.

4. Organizational structures, roles, and routines shape interpretations so that different organizational routines produce very different understandings of risk and error. In a comparison of NASA’s organizational structure with the FAA (Federal Aviation Administration, National Air Traffic System), Vaughan (2005a) shows how invariant and open discussion of the most minor variations or mishaps in the Air Traffic Control system and not in NASA facilitates effective self-scrutiny and sensitivity to mishap.

5. The larger macrosocietal and popular culture embeds particular interpretations of risk and safety (Douglas 1985, Douglas & Wildavsky 1982, Giddens 1999), and repeated organizational and institutional failures breed generalized and disproportionate fear and uncertainty. For example, Glassner (1999) argues that fears are generally focused on the wrong things: on chimerical dangers such as

“superbugs,” “killer kids,” or “teenage moms” rather than more immediate, empirically demonstrable threats to well-being and safety such as poverty or guns. He suggests that the media, ever in search of salacious stories that will increase market share, are simultaneously the promulgators and debunkers of fearmongering. He suggests that misplaced fears are propagated by those who seek to profit by selling protections against that which is feared, generating both demand (creating fears) and supply (safety). In contrast, Clarke (2006) suggests that although the harbingers of impending catastrophe are more reasonable and prescient than many imagine, the “ubiquity of worst cases . . . renders them ordinary and mundane”—no longer able to shock. Cumulatively, there is a loss of public trust and an increase in the likelihood of institutional failure (Freudenberg 1993); this generalized loss of trust in institutions explains three times as much of the variation in public fears as do sociodemographic and ideological variables.

CONCEPTUAL CONUNDRUMS, IDEOLOGICAL ELISIONS, AND STRUCTURAL SUPPORTS FOR SAFETY

Talk about safety culture presents a series of conundrums. First, “safety is defined and measured more by its absence than by its presence” (Reason 1999, p. 4); we are safe because there are no accidents. As non-events, we pay little attention to near misses. “Belief in the attainability of absolute safety . . . impede[s] the achievement of realizable safety goals” (Perrow 2007; Reason 1999, p. 11). By attempting to institutionalize an absence (no accidents), safety culture chases an ever-receding chimera, observable only when it ceases to exist. If absolute safety is chimerical, and if systems are never perfect, some suggest that research should


focus instead on adaptability and resilience. For example, Dekker (2006, pp. 83, 86) calls for research on “the drift into failure,” that is, the ways in which a system’s protective mechanisms slowly push it toward the boundary between resilient adaptability and failure. Rather than wait for the safety silence to be broken and rather than model safety failures, Dekker suggests that engineers should model the ordinary routines and micro decisions that often do not lead to failure but may nonetheless be “linked to macrolevel drift.” Thus, some resilience engineers call for close, detailed observation of sociotechnical systems, for studies of the underlying dynamic relationships within organizations, especially on decisions that would relax production pressures and consequent risks. To become “wiser in the ways of the world” (Perin 2005, p. xix), engineers should learn “something meaningful about insider interpretations, about people’s changing (or fixed) beliefs and how they do or do not act on them” (Dekker 2006, p. 86), and in this way achieve an observable (resilience), rather than spectral (safety), system condition. Eschewing the reductionist conceptions of safety culture, resilience engineering joins the interpretive turn and ends up calling for thick description; engineering becomes ethnography.
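The logic of drift can be put in the form of a biased random walk; the model below is my illustration of the argument, not Dekker’s own, and its parameters are arbitrary. Each day’s “micro decision” is tiny and locally innocuous, but a slight production-pressure bias steadily consumes the safety margin.

```python
import random

def days_until_boundary(margin, pressure_bias, rng):
    """Random-walk sketch of drift into failure: the operating point starts
    well inside the safety margin and moves a little each day."""
    position, day = 0.0, 0
    while position < margin:
        day += 1
        step = rng.uniform(-0.01, 0.01) + pressure_bias  # locally negligible move
        position = max(0.0, position + step)             # cannot be "safer than safe"
    return day

rng = random.Random(1)
mean_days = sum(days_until_boundary(1.0, 0.001, rng) for _ in range(200)) / 200
print(f"mean days to reach the safety boundary: {mean_days:.0f}")
# no single day's decision looks dangerous, yet the boundary is reliably reached
```

The sketch makes Dekker’s methodological point concrete: modeling only failures would miss everything, because every observation short of the boundary looks like a normal, successful day.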

Second, “measures designed to enhance a system’s safety—defenses, barriers, and safeguards—can also bring about its destruction” (Reason 1999, p. 6). Although “most engineering-based organizations believe that safety is best achieved through a predetermined consistency of their processes and behaviors, . . . it is the uniquely human ability to vary and adapt actions to suit local conditions that preserves system safety in a dynamic and uncertain world” (Reason 1999, p. 9; Hollnagel et al. 2006). Thus, organizations routinely succeed, and recover from near disaster, because workers do not follow predetermined protocols or designs; instead, they interpret rules and recipes, adapt resources to innovative uses, develop work-arounds, and invent in situ many of the routines that ultimately

come to constitute the system in practice. Despite this well-documented understanding of organizational behavior, many engineering models fail to describe the way work is actually done, offering instead what turn out to be largely imaginary accounts of work and system performance (Sosa et al. 2003). [Pilot training is a notable exception, routinely instructing pilots to differentiate when to follow or break protocol (Galison 2004, Gladwell 2008).]

Even some who promote resilience in place of safety culture engineering and recognize safety to be an emergent system property—context rather than cause—offer what turn out to be merely more complex models of dynamically intercollated feedback loops. Like Dekker (2006), Leveson et al. (2006) also suggest that a resilient system should include systematic self-reflection. They differ, however, by proposing not deep ethnography, nor skepticism concerning information and communication, but mechanical observation, modeled in terms of a parallel control system that adjusts for variation in the development and behavior of a system from the engineered design. By constant comparison of behavior to design, Leveson et al. suggest that we can build resilience into safety-critical systems. They adapt standard cybernetic models with system-dynamic models that continuously and automatically modify system specifications. The research displays pervasive and persistent refusal to accept the basic feature of complex organizations and sociotechnical systems: They are continually in the making, constructed and reconstructed in every moment with every act. Each new safety process or procedure, each specification of the system, reinstantiates the system that was into something that is—something new, if not different (Ewick & Silbey 1998, pp. 43–44). Because adjustment to the new model always includes some adaptation (with implied variation and innovation), specification of the system is always pushing against an asymptotic aspiration of full information. Thus, there remains an unwarranted confidence in the ability to marshal information, as well as its credibility (Perin 2005), to control


system behavior that is belied by the history of system failure, inviting, Perrow (2007) claims, the next catastrophe.12
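The asymptote argument can be compressed into a schematic control loop. This is my hypothetical reduction of the dispute, not the actual model of Leveson et al. (2006): a parallel monitor repeatedly updates the specification toward observed practice, but practice keeps adapting, so the residual gap between spec and practice stabilizes above zero rather than closing.

```python
import random

def monitor_cycle(cycles, adaptation, gain, rng):
    """Parallel-controller sketch: compare practice to spec, update spec.
    adaptation = scale of workers' ongoing in-situ variation per cycle;
    gain = fraction of the observed gap the spec update absorbs."""
    spec, practice, gaps = 0.0, 0.0, []
    for _ in range(cycles):
        practice += rng.uniform(0, adaptation)  # workers adapt, vary, innovate
        spec += gain * (practice - spec)        # spec pulled toward observed behavior
        gaps.append(practice - spec)            # residual gap after the update
    return gaps

rng = random.Random(2)
gaps = monitor_cycle(1000, adaptation=0.1, gain=0.5, rng=rng)
mean_residual = sum(gaps[-100:]) / 100
print(f"mean residual spec-practice gap: {mean_residual:.3f}")
# the gap settles near adaptation * (1 - gain) / (2 * gain), not at zero:
# full specification remains an asymptote so long as adaptation continues
```

Raising the gain shrinks the residual but never eliminates it while adaptation is nonzero, which is a fair schematic of Silbey’s point: the specification chases a system that is remade with every act.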

Quite noticeably, the discussions of safety culture ignore those features of complex organizations and technological systems from which cultural schemas and interpretations often emerge: normative heterogeneity, competitive and conflicting interests, and inequalities in power and authority. Thus, what is specifically missing from accounts of safety culture is attention to the mechanisms and processes that produce systemic meanings, including understandings of risk, safety, authority, and control. A reflexive, historically grounded, empirical research agenda should address these issues. I offer two suggestions about what such approaches might explore.

Challenging Hegemonic Normalization

Research on accidents and disasters has repeatedly demonstrated what sociologists have known for close to a century: All purposive social action has unintended consequences, and, although social action is inherently variable, social solidarity and coordination are sustained by perceptually, conceptually, and morally normalizing the variation. Thus, we fail to distinguish novel or threatening from familiar and manageable events, productively innovative from functionally destructive deviance. This is true in simple as well as complex relationships and situations. This is true in planning and in implementation. Thus, in studies of hazardous technologies, researchers have documented the ways in which the most rational and rigorous analysts regularly fail to imagine contingencies

12Clarke & Short (1993, p. 375) suggest an additional conundrum deriving from the fact that we must respond to accidents and “disasters through organizations that may be precisely the wrong social instruments for such response” and may themselves be an independent source of risk. “Organizations are built on predictability, but accidents by definition involve unpredictability. . . . Organizations are organized to be inflexible,” when flexibility is exactly what unpredictability requires (p. 392).

that later generate catastrophic hazards. For example, in his study of nuclear weapon deployment, Sagan (1993) noted that (a) American radar installations had been installed across the northern hemisphere to observe possible launches from the Soviet Union and could not observe or monitor missile launches from the southern hemisphere, including launches from Cuba during the missile crisis of October 1962; (b) strategic planning and nuclear safety relied on a design that limited detonation in the absence of at least two independent decision makers, but during the crisis of October 1962, planes were launched with armed nuclear weapons, with only a single pilot having the ability to detonate; and (c) planners generated at least 10 scenarios of common-mode failures13

that might provoke an unintended detonation of a nuclear weapon; in none of these imaginaries did the analysts conceive the configuration of events that actually occurred in 1968 when a B-52 bomber crashed near Thule Air Base, Greenland, with four thermonuclear weapons aboard. [The conventional high-explosive materials detonated upon impact, but the bomb had been designed to withstand the heat and pressure of a crash. “This important safety feature worked” (Sagan 1993, p. 180).] Confident that the enemy was on the other side of the globe rather than 90 miles away, that redundant security systems reinforced rather than undermined each other, and that the collective imagination of the defense planners could anticipate any confluence of events, the United States managed only accidentally to avoid unintended nuclear detonation.

In similar lines of analysis, Perin (2005, p. 5) describes how nuclear power planners had imagined myriad possible problems but not what actually occurred in 2002. Although “leaks had been a generic problem known to the industry since 1990,” at the Davis-Besse Station on Lake Erie near Toledo, Ohio, in 2002, leaks had

13Common-mode refers to a system component that serves multiple other components such that, if it fails, the other modes or components also fail.


eaten “completely through the 6.63 inch carbon steel [vessel head] down to a thin [3/16 inch] internal liner of stainless steel.” No engineer had ever “considered that nozzle lead deposits could eat into the carbon steel of the reactor vessel.”

Finally, Eden (2006) shows how, for more than half a century, government analysts failed to predict and plan for the consequences of nuclear fire during a nuclear war. Because organizations focus on and try to institutionalize what they do well, they often fail to value what lies outside their normal view and capacities. Having developed expertise in precision bombing, the Air Force also developed parallel skill in predicting accuracy and damage, but failed to imagine the fire that would follow, although much of the World War II bombing damage was due to fire. Working only with what they knew best and for which they had secure budgets, they produced a rather poor representation of the world as it had been (e.g., in Dresden, Hiroshima, and Nagasaki) or might be.

To challenge the processes of normalization that impede recognition of hazardously deviant events, future research might attempt to map more systematically not only the ubiquity of and variations within such processes, but most importantly the conditions and resources that challenge hegemonic normalization (Ewick & Silbey 1995, 2003). If hegemony refers to that which is unthinkable, and safety demands seeing what is not there—an accident in the making—then research needs to identify the processes that successfully unsettle organizational routines to make the unthinkable cognizable and the invisible apparent. Recalling that hegemony is what goes without saying, by articulating what is taken for granted and conventionally unspoken, closely observed ethnography can identify the moments in which critical self-reflection emerges to unsettle convention and make space for innovative practices (Kelty 2008, Suchman 2006). From her deep ethnography of nuclear power plants, Perin (2005) identified moments when unexpected knowledge flowed from one group to another, when outside observers brought new perspectives on routines, and when in-house meetings provided

opportunities for sharing concerns about distractions. In her reimagined culture of control, information channels would be laid across functional boundaries, and observational and interpretive competencies would become high priority for staff.

Power Differentials and Structured Inequality

One is hard-pressed to find a reference to power, group interests, conflict, or inequality in the literature promoting safety culture. This may be the most striking feature of this research field. This is not to say that there is no recognition of hierarchy. Indeed, the proponents of safety culture recognize the greater authority and resources of top-level management and recommend using it to institute organizational change from the top down, mandated by organizational leaders, even if designed by hired consultants. Indeed, the consistent valorization of clear lines of hierarchy accompanies a surprising failure to see how this same hierarchy undermines communication and self-reflection about hazards. Recognizing the greater power of management, safety culture advocates nonetheless fail to adequately recognize the diminished power of those in subordinate positions (Edwards 1979, Hodson 2001). As a consequence, organizations often attempt to institute safety culture by addressing only one facet of the organization at a time, for example, people’s attitudes, behaviors, coordinating structures (Cooper 2000), management messages, or organizational symbols, without considering dependencies and interdependencies. Vaughan (1996, 2003) and Perin (2005) refer at length to the dysfunctional safety consequences of the hierarchical credibility gap that derives from the embedded, but unacknowledged, stratification. Lower-level actors are often repositories of critical information and counterhegemonic views, yet are often unable to persuade higher-ups in the organization of either the credibility of their knowledge or relevance of their perspectives. To the extent that the consequences of hazardous technologies are

www.annualreviews.org • Safety and Culture 361


promoted and managed by advocating safety culture that elides issues of power and inequality, it becomes, intended or not, an ideological project (Silbey 1998).

Proponents of safety culture ignore the fact that although safety has mutual benefits, all are not made equally better off, even from a mutually beneficial objective. Where one management theory regards workers as substitutable costs, whose functions are more economically purchased through outsourcing, an opposite management theory imagines that all members of the organization are similarly situated, with commensurate interests, trust, and loyalty. From this latter, enlightened leadership perspective, corporate managers are safety stewards promoting a generally shared and valued objective and can thus expect organizational members to follow enthusiastically; after all, we all benefit from increased safety. Yet the differential interests of upper-level managers and lower-level workers are systematically elided through popular ideologies and representational practices that insist on our mutual self-interest. By assuming a similarity of interest, managers fail to address the differential positions and, more importantly, fail to recognize the differential resources workers bring to the organization that can be mobilized in the service of greater safety.

Moreover, with the decline in unionization, lower-level workers lack the institutionalized base from which to make their voices heard in their own interest or in the interests of the firm, including their knowledge of safe or unsafe operating conditions. Research has shown, for example, that safety violations, accidents, and product defects increase with outsourcing (Kochan et al. 1992). Because of a lack of oversight, integration across functions, and an intimate knowledge of the production process, contract suppliers are unable to provide the mitigation of hazard that had been supplied in the past through shop-floor, rather than upper-level-management, stewardship. Furthermore, as firms spread more of their capital risks through innovative financial instruments, they have inadvertently broken the organizational field, one might say, by eschewing

the institutionalized mechanisms, such as worker expertise, oversight, and solidarity, that help mitigate risk. One cannot help noticing, for example, the rapidity with which particularly hazardous technologies, such as oil refineries, are bought and sold. In the financialized world, a refinery is an asset, not an organization, neither a community nor a complex system. With each shift in ownership comes a new management regime with a new set of procedures, policies, and practices; new IT systems; and different safety regimes.

Attention to power and inequality suggests several lines of research. Studies might productively bring the literatures on financial risk and material hazards into conversation with each other. Surely they have been interacting with each other during the past decade, most obviously when changes in firm ownership bring changes in personnel and policies. Future research might also explore how safety culture discourse operates in different stratification systems. For example, where there is deep poverty and low education, obvious human need may outweigh concerns about safety, and rational discourse may function merely as a concession to external or symbolic constituencies. Where there is more education and material abundance, talk of safety culture may obscure the inherent risks of complex systems, disguising them behind a facade of personal risk and individual deviance.

Most importantly, however, research should explore ways in which differentially situated interests might be mobilized to produce countervailing power. Where the relations between the sources and the victims of hazardous risks are bound by neither space (across geographic boundaries) nor time (across generations), security and safety certainly seem elusive (Beck 1992). Some recent litigation campaigns suggest tactics that, if not directly controlling hazardous systems, might nonetheless highlight the links among dispersed organizations, technologies, and collectively as well as locally experienced harms. Deterritorialized risks can be brought to earth, so to speak, apprehended and localized by mass or class action litigations, such as in the


asbestos and tobacco litigations or in lawsuits against oil companies for global warming.

CONCLUSION

Safety culture is a particularly narrow attempt to tame Prometheus, where the central problematic, assembling the social (Latour 2005), is assumed rather than explored. In its most common invocations, safety culture becomes either a thing or an ether. Culture names "what is left over after you forgot what it was you were originally trying to learn" (O'Reilly & Chatman 1996, p. 159). Rather than address the structural and historical conditions that either sustain or impede safe organizational performance, culture becomes a supplement, the detritus of social transactions. As the phenomena continually recede before efforts to control them, research advocating safety culture seems, in the end, to suggest that responsibility

for the consequences of complex technologies resides in a cultural ether, everywhere or nowhere. If the ether proves elusive, the explanation of operator error is always available. In seventeenth-century England, when experiments went wrong in performances before the Royal Society, the air-pump exploding for example, assistants and craftspeople were blamed for the failure rather than the gentlemen scientists or even the artifacts themselves (Shapin & Schaffer 1985). Because technologies concretize the scientific theories and social relations, hierarchy, and authority of the organizations assembled around them, it was difficult to point toward systemic failures in the apparatus without undermining the scientist designers. Since the seventeenth century, Prometheus has become omnipresent, if not omnipotent, and scientific authority even more secure; yet, four centuries on, we still focus on the assistant, failing once again to tame Prometheus.

DISCLOSURE STATEMENT

The author is not aware of any biases that might be perceived as affecting the objectivity of this review.

ACKNOWLEDGMENTS

I am particularly grateful to Joelle Evans and Ayn Cavicchi for their excellent research assistance and to Sandy Brown for initiating my work on this topic. I also appreciate the comments and suggestions I received at various stages of writing from Bernard Harcourt, Hugh Gusterson, Tom Baker, Ronen Shamir, Tanu Agrawal, Tanina Rostain, Douglas Goodman, Chris Kelty, Diane Vaughan, Don Lessard, Nelson Repenning, Nancy Leveson, Matt Richards, Xaq Frolich, Ruthanne Huising, John Paul Ferguson, Keith Bybee, Carroll Seron, Michael Fischer, Connie Perin, Jean Jackson, Chihyung Jeon, Susan Sterrett, Jason Jay, Kieran Downes, Sophia Roosth, David Kaiser, and Stefan Helmreich.

LITERATURE CITED

Abbott AD. 1988. The System of Professions: An Essay on the Division of Expert Labor. Chicago: Univ. Chicago Press

Alvarez R, Arends J. 2000. Fire, earth and water: an assessment of the environmental, safety and health impacts of the Cerro Grande fire on Los Alamos National Laboratory, a Department of Energy facility. Concerned Citizens for Nuclear Safety and Nuclear Policy Project. http://www.nuclearactive.org/docs/CerroGrandeindex.html

Associated Press. 1989. Inquiry finds laxity at root of 35 rail deaths last year. Assoc. Press, Nov. 7


Baker Panel. 2007. The Report of the BP U.S. Refineries Independent Safety Review Panel. http://www.bp.com/bakerpanelreport

Baker T. 2002. Risk, insurance and the social construction of responsibility. See Baker & Simon 2002, pp. 33–51

Baker T. 2005. The Medical Malpractice Myth. Chicago: Univ. Chicago Press
Baker T, Simon J, eds. 2002. Embracing Risk: The Changing Culture of Insurance and Responsibility. Chicago: Univ. Chicago Press
Ballard GM. 1988. Nuclear Safety After Three Mile Island and Chernobyl. London: Elsevier Appl. Sci.
Beamish TD. 2000. Accumulating trouble: complex organization, a culture-of-silence, and a secret spill. Soc. Probl. 47(4):473–98
Beamish TD. 2002. Silent Spill: The Organization of an Industrial Crisis. Cambridge, MA: MIT Press
Beck U. 1992. Risk Society: Towards a New Modernity, transl. M Ritter. London: Sage
Becker H. 1963. The Outsiders: Studies in the Sociology of Deviance. New York: Free Press
Bernstein M. 1955. Regulating Business by Independent Commission. Princeton, NJ: Princeton Univ. Press
Bittner E. 1983. Technique and the conduct of life. Soc. Probl. 30:249–61
Bourrier M. 1996. Organizing maintenance work at two American nuclear power plants. J. Conting. Crisis Manag. 4:104–12
Brooks K, Lazorko L, Zygmunt J, Hill J, Savage P, et al. 1986. Tighter controls on chemical hazards. Chem. Week, Jan. 1/Jan. 8, p. 21
Brown P, Mikkelsen EJ. 1990. No Safe Place: Toxic Waste, Leukemia and Community Action. Berkeley: Univ. Calif. Press
Burke TF. 2004. Lawyers, Lawsuits, and Legal Rights: The Battle over Litigation in American Society. Berkeley: Univ. Calif. Press
Carroll J. 1998a. Organizational learning activities in high-hazard industries: the logics underlying self-analysis. J. Manag. Stud. 35:699–717
Carroll J. 1998b. Safety culture as an ongoing process: culture surveys as opportunities for enquiry and change. Work Stress 12:272–84
Chabon M. 2008. In priceland. New York Rev. Books, May 1
Ciaverelli A, Figlock R. 1996. Organizational factors in aviation accidents. In Proceedings of the Ninth Annual Symposium on Aviation Psychology, pp. 1033–35. Columbus, OH: Dep. Aviat.
Clarke L. 1989. Acceptable Risk?: Making Decisions in a Toxic Environment. Berkeley: Univ. Calif. Press
Clarke L. 1993. The disqualification heuristic: when do organizations misperceive risk? In Research in Social Problems and Public Policy, ed. RT Youn, WF Freudenberg, 5:289–312. Greenwich, CT: JAI
Clarke L. 1999. Mission Improbable: Using Fantasy Documents to Tame Disaster. Chicago: Univ. Chicago Press
Clarke L. 2006. Worst Cases: Terror and Catastrophe in the Popular Imagination. Chicago: Univ. Chicago Press
Clarke L, Short JF. 1993. Social organization and risk: some current controversies. Annu. Rev. Sociol. 19:375–99
Cohen MD, March JG, Olsen JP. 1972. A garbage can model of organizational choice. Adm. Sci. Q. 17:1–25
Cooper MD. 2000. Towards a model of safety culture. Saf. Sci. 36:111–36
Cox S, Cox T. 1991. The structure of employee attitudes to safety: a European example. Work Stress 5:93–106
Davidson A. 1990. In the Wake of the Exxon Valdez: The Devastating Impact of the Alaska Oil Spill. San Francisco: Sierra Club Books
Dekker S. 2006. Resilience engineering: chronicling the emergence of confused consensus. See Hollnagel et al. 2006, pp. 77–92
Denison DR. 1996. What is the difference between organizational culture and organizational climate? A native's point of view on a decade of paradigm wars. Acad. Manag. Rev. 21:619–54
Derthick M, Quirk PJ. 1985. The Politics of Deregulation. Washington, DC: Brookings Inst.
Deutch JM, Lester RK. 2004. Making Technology Work: Applications in Energy and the Environment. Cambridge, UK: Cambridge Univ. Press
Diamond S. 1986. Chernobyl causing big revisions in global nuclear power policies. New York Times, Oct. 27, p. 1
Douglas M. 1985. Risk as a forensic resource. Daedalus 119:1–16
Douglas M, Wildavsky A. 1982. Risk and Culture. Berkeley: Univ. Calif. Press


Eden L. 2006. Whole World on Fire: Organizations, Knowledge and Nuclear Weapons Devastation. Ithaca, NY: Cornell Univ. Press
Edwards R. 1979. Contested Terrain: The Transformation of the Workplace in the Twentieth Century. New York: Basic Books
Eisenhardt K. 1993. High reliability organizations meet high velocity environments: common dilemmas in nuclear power plants, aircraft carriers, and microcomputer firms. In New Challenges Understanding Organizations, ed. KH Roberts, pp. 117–35. New York: Macmillan

Ericson R, Doyle A, Barry D. 2003. Insurance as Governance. Toronto: Univ. Toronto Press
Erikson K. 1978. Everything in Its Path: Destruction of Community in the Buffalo Creek Flood. New York: Simon & Schuster
Ewald F. 2002. The return of Descartes's malicious demon: an outline of a philosophy of precaution. See Baker & Simon 2002, pp. 273–301
Ewick P, Silbey SS. 1995. Subversive stories and hegemonic tales: toward a sociology of narrative. Law Soc. Rev. 29(2):197–226
Ewick P, Silbey SS. 1998. The Common Place of Law: Stories from Everyday Life. Chicago: Univ. Chicago Press
Ewick P, Silbey SS. 2003. Narrating social structure: stories of resistance to legal authority. Am. J. Sociol. 109:1328–72
Fischer MMJ. 2006. Culture and cultural analysis. Theory Cult. Soc. 23:360–64
Fortun K. 2001. Advocacy After Bhopal. Chicago: Univ. Chicago Press
Freudenburg WR. 1993. Risk and recreancy: Weber, the division of labor, and the rationality of risk perception. Soc. Forces 71:909–32
Friedman L. 1967. Legal rules and the process of social change. Stanford Law Rev. 19:786
Friedman L. 1985. Total Justice: What Americans Want from the Legal System and Why. Boston: Beacon
Galanter M. 1983. Reading the landscape of disputes: what we know and don't know (and think we know) about our allegedly contentious and litigious society. UCLA Law Rev. 31:4–71
Galanter M. 1994. Predators and parasites: lawyer-bashing and civil justice. Ga. Law Rev. 28:633–81
Galanter M. 2006. Law, anti-law and social science. Annu. Rev. Law Soc. Sci. 2:1–16
Galison P. 1997. Image and Logic: A Material Culture of Microphysics. Chicago: Univ. Chicago Press
Galison P. 2004. Removing knowledge. Crit. Inq. 31:229–43
Geertz C. 1973. The Interpretation of Cultures: Selected Essays. New York: Basic Books
Geller ES. 1994. Ten principles for achieving a total safety culture. Prof. Saf. 39:18–24
Gherardi S, Nicolini D. 2000. To transfer is to transform: the circulation of safety knowledge. Organization 7:329–48
Giddens A. 1999. Risk and responsibility. Mod. Law Rev. 62(1):1–10
Gieryn TF, Figert SE. 1990. Ingredients for a theory of science in society: O-rings, ice water, C-clamp, Richard Feynman and the press. In Theories of Science in Society, ed. SE Cozzens, TF Gieryn, pp. 67–97. Bloomington: Ind. Univ. Press
Gladwell M. 2008. Outliers. Boston: Little Brown
Glassner D. 1999. The Culture of Fear: Why Americans Are Afraid of the Wrong Things. New York: Basic Books
Glennon DP. 1982. Measuring organizational safety climate. Aust. Saf. News, Jan./Feb., pp. 23–28
Guldenmund FW. 2000. The nature of safety culture: a review of theory and research. Saf. Sci. 34:215–57
Gusterson H. 1998. Nuclear Rites. Berkeley: Univ. Calif. Press
Gusterson H. 2007. Anthropology and militarism. Annu. Rev. Anthropol. 36:155–75
Habermas J. 1975. The Legitimation Crisis. Boston: Beacon
Hacker B. 1987. The Dragon's Tail: Radiation Safety in the Manhattan Project, 1942–1946. Berkeley: Univ. Calif. Press
Hacking I. 1990. The Taming of Chance. Cambridge, UK: Cambridge Univ. Press
Hacking I. 2003. Risk and dirt. In Risk and Morality, ed. R Ericson, A Doyle, pp. 22–47. Toronto: Univ. Toronto Press
Heimer C. 1985. Reactive Risk and Rational Action: Managing Moral Hazard in Insurance Contracts. Berkeley: Univ. Calif. Press
Heimer C. 1988. Social structure, psychology and the estimation of risk. Annu. Rev. Sociol. 14:491–519


Helmreich S. 2001. After culture: reflections on the apparition of anthropology in artificial life, a science of simulation. Cult. Anthropol. 16:613–28

Hilgartner S. 1992. The social construction of risk objects: or, how to pry open networks of risk. In Organizations, Uncertainties and Risk, ed. J Short, L Clarke, pp. 39–53. Boulder, CO: Westview

Hodson R. 2001. Dignity at Work. Cambridge, UK: Cambridge Univ. Press
Hollnagel E, Woods DD, Leveson N, eds. 2006. Resilience Engineering: Concepts and Precepts. Aldershot, UK: Ashgate
Holtom W, McCann M. 2004. Distorting the Law: Politics, Media and the Litigation Crisis. Chicago: Univ. Chicago Press
Hughes TP. 1998. Rescuing Prometheus. New York: Vintage Books
Humphrey SE, Nahrgan JD, Morgeson FP. 2007. Integrating motivational, social and contextual work design features: a meta-analytic summary and theoretical extension of the work design literature. J. Appl. Psychol. 92:1332–56
Int. Atomic Energy Agency (IAEA). 1991. Safety Culture. A Report by the International Nuclear Safety Advisory Group. Safety Ser. 75-INSAG-4. Vienna, Austria: IAEA
Int. Atomic Energy Agency (IAEA). 1992. Information sheet. Tech. Comm. Meet. Impact Manag. Organ. Safe Operation of Nuclear Power Plants. Ref. J4-TC-804. Vienna, Austria. Aug.–Sept.
Jacobs MD, Hanrahan NW, eds. 2005. The Blackwell Companion to the Sociology of Culture. Malden, MA: Blackwell
Jasanoff S, ed. 1994. Learning from Disaster. Philadelphia: Univ. Pa. Press
Jasanoff S. 2005. Designs on Nature: Science and Democracy in Europe and the United States. Princeton, NJ: Princeton Univ. Press
Joskow P, Noll R. 1977. Regulation in theory and practice. In Studies in Public Regulation, ed. G Fromm, pp. 1–65. Cambridge, MA: MIT Press
Joskow P, Rose N. 1989. The effects of economic regulation. In Handbook of Industrial Organization, ed. R Schmalensee, R Willing, pp. 1450–506. Amsterdam: North-Holland, Elsevier
Kagan RA. 2003. Adversarial Legalism: The American Way of Law. Cambridge, MA: Harvard Univ. Press
Kahneman D, Slovic P, Tversky A, eds. 1982. Judgement Under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge Univ. Press
Keeble J. 1991. Out of the Channel: The Exxon Valdez Oil Spill in Prince William Sound. New York: HarperCollins
Kelty C. 2008. Allotropes of fieldwork in nanotechnology. In Emerging Conceptual, Ethical and Policy Issues in Bionanotechnology, Vol. 101, ed. F Jotterand, pp. 157–80. Dordrecht/London: Springer
Kiefer F. 1984. Bhopal tragedy resounds for other multinationals. Christ. Sci. Monit., Dec. 12, p. 1
Klein N. 2007. The Shock Doctrine: The Rise of Disaster Capitalism. New York: Metropolitan Books
Klein RL, Bigley GA, Roberts KH. 1995. Organizational culture in high reliability organizations: an extension. Hum. Relat. 48:771–93
Knowles S. 2007a. Engineers in disaster: underwriters laboratories and the invention of risk engineering in the United States. Unpubl. manuscr.
Knowles S. 2007b. What is a disaster? Social science disaster research in the postwar United States. Unpubl. manuscr.
Kochan T, Smith M, Wells JC, Rebitzer JB. 1992. Managing the safety of contingent workers: a study in contract workers in the petrochemical industry. Sloan Sch. Work. Pap. 3442-92-BPS, Cambridge, MA
Kurzman D. 1987. A Killing Wind: Inside Union Carbide and the Bhopal Catastrophe. New York: McGraw-Hill
LaPorte T, Consolini P. 1991. Working in practice but not in theory: theoretical challenges of "high-reliability organizations." J. Public Adm. Res. Theory 1:19–48
LaPorte TR, Rochlin G. 1994. A rejoinder to Perrow. J. Conting. Crisis Manag. 2:221–27
Lash SB, Wynne B. 1992. Introduction. See Beck 1992, pp. 1–8
Latour B. 2005. Reassembling the Social. Oxford: Oxford Univ. Press
Lee TR. 1996. Perceptions, attitudes and behaviour: the vital elements of a safety culture. Health Saf. Oct:1–15
Leveson N, Dulac N, Zipkin D, Cutcher-Gershenfeld J, Carroll J, Barrett B. 2006. Engineering resilience into safety-critical systems. See Hollnagel et al. 2006, pp. 95–123
Lieberman JK. 1981. The Litigious Society. New York: Basic Books
MacKenzie D. 2006. An Engine Not a Camera: How Financial Models Shape Markets. Cambridge, MA: MIT Press


Maines R. 2005. Asbestos and Fire: Technological Trade-Offs and the Body at Risk. New Brunswick, NJ: Rutgers Univ. Press
Marone J, Woodhouse E. 1986. Averting Catastrophe: Strategies for Regulating Risky Technologies. Berkeley: Univ. Calif. Press

Medvedev ZA. 1992. The Legacy of Chernobyl. New York: W.W. Norton
Mezias S. 1994. Financial meltdown as normal accident: the case of the American savings and loan industry. Acc. Organ. Soc. 19:181–92
Morgeson FP, Humphrey SE. 2006. The work design questionnaire (WDQ): developing and validating a comprehensive measure for assessing job design and the nature of work. J. Appl. Psychol. 91:1321–39
O'Reilly C, Chatman J. 1996. Culture as social control: corporations, cults, and commitment. Res. Organ. Behav. 18:157–200
Orren K. 1991. Belated Feudalism: Labor, Law and Liberal Development in the United States. Cambridge, UK: Cambridge Univ. Press
Ostrom L, Wilhelmsen C, Kaplan B. 1993. Assessing safety culture. Nucl. Saf. 34:163–72
Peltzman S, Levine ME, Noll RG. 1989. The economic theory of regulation after a decade of deregulation. Brookings Pap. Econ. Act. Microecon. 1989:1–59
Perin C. 2005. Shouldering Risks: The Culture of Control in the Nuclear Power Industry. Princeton, NJ: Princeton Univ. Press
Perrow C. 1999 (1984). Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton Univ. Press. 2nd ed.
Perrow C. 2007. The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters. Princeton, NJ: Princeton Univ. Press
Petryna A. 2002. Life Exposed: Biological Citizens After Chernobyl. Princeton, NJ: Princeton Univ. Press
Pfohl S. 1978. Predicting Dangerousness. Lexington, MA: Lexington Books
Pidgeon NF. 1991. Safety culture and risk management in organizations. J. Cross Cult. Psychol. 22:129–40
Pidgeon N. 1997. The limits to safety culture: politics, learning and man-made disasters. J. Conting. Crisis Manag. 5:1–14
Pidgeon N. 1998. Safety culture: key theoretical issues. Work Stress 13:202–16
Reason JT. 1997. Managing the Risks of Organizational Accidents. Aldershot, UK: Ashgate
Reason JT. 1999. Safety paradoxes and safety culture. Int. J. Inj. Control Saf. Promot. 7:3–14
Rees J. 1994. Hostages of Each Other: The Transformation of Nuclear Safety Since Three Mile Island. Chicago: Univ. Chicago Press
Roberts KH, Rousseau DM. 1989. Research in nearly failure-free, high-reliability organizations. IEEE Trans. Eng. Manag. 36:132–39
Roberts KH, Rousseau DM, LaPorte TR. 1994. The culture of high reliability: quantitative and qualitative assessment aboard nuclear powered aircraft carriers. J. High Technol. Manag. Res. 5:141–61
Rochlin GI, LaPorte TR, Roberts KH. 1987. The self-designing high-reliability organization: aircraft carrier flight operations at sea. Nav. War Coll. Rev. 49:76–90
Rose N. 1989. Governing the Soul: The Shaping of the Private Self. London: Routledge
Rose N. 1999. The Powers of Freedom. Cambridge/New York: Cambridge Univ. Press
Sagan S. 1993. The Limits of Safety: Organizations, Accidents and Nuclear Weapons. Princeton, NJ: Princeton Univ. Press
Sagan SD. 1996. When redundancy backfires: why organizations try harder and fail more often. Presented at Annu. Meet. Am. Polit. Sci. Assoc., San Francisco, Sept.
Schein EH. 1992. Organizational Culture and Leadership. San Francisco: Jossey-Bass
Scherer N. 2005. Scoring Points: Politicians, Activists and the Lower Federal Court Appointment Process. Stanford, CA: Stanford Univ. Press
Schneiberg M, Bartley T. 2008. Organizations, regulation, and economic behavior: regulatory dynamics and forms from the 19th to 21st century. Annu. Rev. Law Soc. Sci. 4:31–61
Schulman P. 1993. The analysis of high reliability organizations: a comparative framework. In New Challenges Understanding Organizations, ed. KH Roberts, pp. 33–54. New York: Macmillan
Setbon M. 1993. Pouvoirs Contre Sida: De la Transfusion Sanguine au Depistage: Decisions et Pratique en France, Grande-Bretagne et Suede. Paris: Ed. Seuil


Sewell WH. 2005. Logics of History. Chicago: Univ. Chicago Press
Shamir R. 2006. Neo-liberal moment. Presented at Annu. Meet. Law Soc. Assoc., Las Vegas, May
Shamir R. 2008. The age of responsibilization: on market-embedded morality. Econ. Soc. 37:1–19
Shapin S, Schaffer S. 1985. The Leviathan and the Air-Pump. Princeton, NJ: Princeton Univ. Press
Silbey J. 2001. Patterns of courtroom justice. J. Law Soc. 28:97
Silbey J. 2004. Judges as film critics: new approaches to filmic evidence. Univ. Mich. J. Law Reform 37:493
Silbey J. 2005. Filmmaking in the precinct house and the genre of documentary film. Columbia J. Law Arts 29:107
Silbey J. 2007a. Criminal performances: film, autobiography and confession. New Mexico Law Rev. 37:189
Silbey J. 2007b. Truth tales and trial films. Loyola Law Rev. 40:551
Silbey SS. 1998. Ideology, power and justice. In Justice and Power in Socio-Legal Studies, ed. AD Sarat, B Garth, pp. 272–308. Evanston, IL: Northwest. Univ. Press
Silbey SS. 2001. Legal culture and consciousness. In International Encyclopedia of the Social and Behavioral Sciences, pp. 8623–29. Amsterdam: Elsevier Sci.
Silbey SS. 2005a. After legal consciousness. Annu. Rev. Law Soc. Sci. 1:323–68
Silbey SS. 2005b. Everyday life and the constitution of legality. In The Blackwell Companion to the Sociology of Culture, ed. M Jacobs, N Hanrahan, pp. 332–45. Oxford: Blackwell
Silbey SS, Sarat A. 1988. Dispute processing in law and legal scholarship: from institutional critique to the reconstitution of the juridical subject. Univ. Denver Law Rev. 66:437–98
Simon J. 1988. The ideological effects of actuarial practices. Law Soc. Rev. 22:771–800
Sosa M, Eppinger SD, Rowles C. 2003. The misalignment of product architecture and organizational structure in complex product development. Manag. Sci. 50:1674–89
Starbuck W, Milliken F. 1988. Challenger: fine-tuning the odds until something breaks. J. Manag. Stud. 25:319–40
Stephens M. 1980. Three Mile Island. New York: Random House
Stern GM. 1976/2008. The Buffalo Creek Disaster. New York: Random House
Suchman L. 2006. Human-Machine Configurations: Plans and Situated Actions. Cambridge, UK: Cambridge Univ. Press
Swidler A. 1986. Culture in action: symbols and strategies. Am. Sociol. Rev. 51:273–86
U.S. Nucl. Regul. Comm. 1989. Policy statement on the conduct of nuclear power plant operations. 54 Fed. Reg. 3424, Jan. 24
Vaughan D. 1996. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: Univ. Chicago Press
Vaughan D. 1999. The dark side of organizations: mistake, misconduct and disaster. Annu. Rev. Sociol. 25:271–305
Vaughan D. 2003. History as cause: Columbia and Challenger. Chapter 8. Rep. Columbia Accid. Investig. Board 1:195–204
Vaughan D. 2004. Theorizing disaster: analogy, historical ethnography, and the Challenger accident. Ethnography 5:313–45
Vaughan D. 2005a. Organizational rituals of risk and error. In Organizational Encounters with Risk, ed. B Hutter, MK Power, pp. 33–66. Cambridge, UK: Cambridge Univ. Press
Vaughan D. 2005b. System effects: on slippery slopes, negative patterns, and learning by mistake. In Organizations at the Limit: NASA and the Columbia Accident, ed. WH Starbuck, F Farjoun, pp. 41–59. Oxford: Blackwell
Vaughan D. 2006. NASA revisited: theory, analogy, and public sociology. Am. J. Sociol. 112:353–93
Vogel D. 1981. The 'new' social regulation in comparative and historical perspective. In Regulation in Perspective, ed. T McCraw, pp. 155–85. Cambridge, MA: Harvard Univ. Press
Vogel D. 1986. National Styles of Regulation: Environmental Policy in Great Britain and the United States. Ithaca, NY: Cornell Univ. Press
Walker JS. 2004. Three Mile Island: A Nuclear Disaster in Historical Perspective. Berkeley: Univ. Calif. Press
Weick K. 1987. Organizational culture as a source of high reliability. Calif. Manag. Rev. 29:112–27
Weick K, Roberts KH. 1993. Collective mind in organizations: heedful interrelating on flight decks. Admin. Sci. Q. 38(3):357–81


Weick K, Sutcliffe K. 2001. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco: Wiley
Weick K, Sutcliffe KM, Obstfeld D. 1999. Organizing for high reliability: processes of collective mindfulness. In Research in Organizational Behavior, ed. BM Staw, LL Cummings, 21:81–123. Greenwich, CT: JAI
Welke BY. 2001. Recasting American Liberty: Gender, Race, Law and the Railroad Revolution, 1865–1920. Cambridge, UK: Cambridge Univ. Press
Westrum R. 1997. Social factors in safety-critical systems. In Human Factors in Safety Critical Systems, ed. F Redmill, J Rajan, pp. 233–56. London: Butterworth-Heinemann
Witt JB. 2004. The Accidental Republic: Crippled Workingmen, Destitute Widows, and the Remaking of American Law. Cambridge, MA: Harvard Univ. Press
Zelizer V. 1979. Morals and Markets. The Development of Life Insurance in the United States. New Brunswick, NJ: Transaction Books
Zelizer V. 1997. The Social Meaning of Money: Pin Money, Paychecks, Poor Relief and Other Currencies. Princeton, NJ: Princeton Univ. Press
Zhang H, Wiegmann DA, von Thaden TL, Sharma G, Mitchell AA. 2002. Safety Culture: A Concept in Chaos? Santa Monica, CA: Hum. Factors Ergon. Soc.
Zohar D. 1980. Safety climate in industrial organizations: theoretical and applied implications. J. Appl. Psychol. 65:96–102




Annual Review of Sociology

Volume 35, 2009 — Contents

Frontispiece
Herbert J. Gans ............ xiv

Prefatory Chapters

Working in Six Research Areas: A Multi-Field Sociological Career
Herbert J. Gans ............ 1

Theory and Methods

Ethnicity, Race, and Nationalism
Rogers Brubaker ............ 21

Interdisciplinarity: A Critical Assessment
Jerry A. Jacobs and Scott Frickel ............ 43

Nonparametric Methods for Modeling Nonlinearity in Regression Analysis
Robert Andersen ............ 67

Gender Ideology: Components, Predictors, and Consequences
Shannon N. Davis and Theodore N. Greenstein ............ 87

Genetics and Social Inquiry
Jeremy Freese and Sara Shostak ............ 107

Social Processes

Race Mixture: Boundary Crossing in Comparative Perspective
Edward E. Telles and Christina A. Sue ............ 129

The Sociology of Emotional Labor
Amy S. Wharton ............ 147

Societal Responses to Terrorist Attacks
Seymour Spilerman and Guy Stecklov ............ 167

Intergenerational Family Relations in Adulthood: Patterns, Variations, and Implications in the Contemporary United States
Teresa Toguchi Swartz ............ 191



Institutions and Culture

Sociology of Sex Work
Ronald Weitzer ............ 213

The Sociology of War and the Military
Meyer Kestnbaum ............ 235

Socioeconomic Attainments of Asian Americans
Arthur Sakamoto, Kimberly A. Goyette, and ChangHwan Kim ............ 255

Men, Masculinity, and Manhood Acts
Douglas Schrock and Michael Schwalbe ............ 277

Formal Organizations

American Trade Unions and Data Limitations: A New Agenda for Labor Studies
Caleb Southworth and Judith Stepan-Norris ............ 297

Outsourcing and the Changing Nature of Work
Alison Davis-Blake and Joseph P. Broschak ............ 321

Taming Prometheus: Talk About Safety and Culture
Susan S. Silbey ............ 341

Political and Economic Sociology

Paradoxes of China's Economic Boom
Martin King Whyte ............ 371

Political Sociology and Social Movements
Andrew G. Walder ............ 393

Differentiation and Stratification

New Directions in Life Course Research
Karl Ulrich Mayer ............ 413

Is America Fragmenting?
Claude S. Fischer and Greggor Mattson ............ 435

Switching Social Contexts: The Effects of Housing Mobility and School Choice Programs on Youth Outcomes
Stefanie DeLuca and Elizabeth Dayton ............ 457

Income Inequality and Social Dysfunction
Richard G. Wilkinson and Kate E. Pickett ............ 493

Educational Assortative Marriage in Comparative Perspective
Hans-Peter Blossfeld ............ 513



Individual and Society

Nonhumans in Social Interaction
Karen A. Cerulo ............ 531

Demography

Social Class Differentials in Health and Mortality: Patterns and Explanations in Comparative Perspective
Irma T. Elo ............ 553

Policy

The Impacts of Wal-Mart: The Rise and Consequences of the World's Dominant Retailer
Gary Gereffi and Michelle Christian ............ 573

Indexes

Cumulative Index of Contributing Authors, Volumes 26–35 ............ 593

Cumulative Index of Chapter Titles, Volumes 26–35 ............ 597

Errata

An online log of corrections to Annual Review of Sociology articles may be found at http://soc.annualreviews.org/errata.shtml
