SOCIETY AND TECHNOLOGICAL RISKS
If there is an organizing theme in sociology, it is social order: what it looks like, how to think about the various forms it takes, and how to explain it. Conversely, what happens when social order breaks down? What changes are wrought in how people see the world, and most important, what is altered in how they relate to one another when social order goes awry? The study of risk, danger, and catastrophe is a special case of the larger field of social breakdown.
Sociologists have long been interested in phenomena that harm people and what people value. Until recently, most of this work concentrated on harm from natural events such as earthquakes, floods, and tornadoes, but many researchers now write about "technical" or "technological" risks. In some ways the distinction between natural and technological risks or disasters is not helpful: There is no objective difference between a death caused by a fire and a death caused by an airplane crash. Yet in other ways those who have been fascinated by how modern technologies fail people have asked a broader set of questions than they could have if they did not see a difference between natural and technological risks. They have asked new questions about the functions of expertise and science in modern society, the roles of power and authority in the creation of danger, and the capacity of people to build systems they cannot control.
In this encyclopedia of sociology, risk and danger are treated mainly as a sociological problem, but this is not necessarily the case. Scholars writing about these issues come from economics, geography, psychology (Mellers et al. 1998), anthropology (Oliver-Smith 1996), and even engineering (Starr 1995) and physics. This is basically a good thing: Too much sociology is self-referential and inbred, and truly interdisciplinary work creates considerable intellectual, if not professional, excitement. No one can write about technological risks in an interesting way without reading and thinking in interdisciplinary terms.
Scholars concerned with technological risks have addressed a wide variety of topics that range from how individuals think about risks to how nation-states develop strategies to mitigate threats from failures of high technology. Some scholars even write about risks that might be faced by societies far in the future. Toxic threats have drawn particularly close scrutiny from scholars, and there are important sociological studies of Love Canal, Three Mile Island, Chernobyl, Bhopal, the Challenger, nuclear waste, and nuclear weapons. One reason for this is that toxic risks invert the way natural disasters do damage. Rather than assaulting people from the outside, as do other calamities, toxic hazards assault bodies from within. Toxic injuries also have no definable end, and so their victims can never know when they are safe from further damage. The point here is that the meaning of toxic threats is fundamentally different from that of natural disasters (Couch and Kroll-Smith 1985; Erikson 1990, 1994). The disruption in social order thus can be largely internal, with psychological and emotional suffering caused by the breakdown of external social systems (Sorokin 1968).
In general, the sociology of risk is concerned with researching and explaining how interactions between technology and modes of social organization create hazards or the potential for hazards (Clarke and Short 1993). A hazard can be an actual threat to people's lives (toxic chemical contamination, for example) or the perception that there is a threat. Indeed, many analysts focus on risk perception: what people think is dangerous and why they think what they do (Freudenburg 1988). The word "technology" refers to the social and mechanical tools people use to accomplish something, such as the design of a nuclear power plant and the vocabularies experts use when they talk about effectively evacuating an urban area after a major radiation release. "Modes of social organization" refers to both social structure (e.g., hierarchies of power) and culture (e.g., the degree of legitimacy granted to experts). In the twenty-first century, society will continue its march toward social and technical complexity. One expression of this complexity is the capacity to create machines and institutional arrangements that are at once grand and terrifying. With these developments, it seems, publics are increasingly aware of the potentially devastating consequences of system failures even as they enjoy the cornucopia engendered by modern social organization and modern technology.
This is an opportune place to identify an area of research that will be increasingly important for both intellectual and policy reasons. A good deal of work in psychology and economics, echoing the concerns of political and economic elites, focuses on public perception of risk. Much of that work has shown that the general public does not make decisions in accordance with a hyperrational calculus in which its preferences and values are always consistent and, more to the point, agree with those of trained scientific researchers. Consonant with this concern with public irrationality is the notion that people panic when faced with risks they do not understand. It is easy to find this idea in media reports of high-level politicians' remarks after a large accident: Politicians worry that people will overreact in very unproductive ways. The image is one of people trampling one another to reach the exits of a burning building or to escape a sniper's random rifle shots. Translated to the perception of risk regarding accidents and disasters, the image becomes one of individuals pursuing their own interests to the exclusion of those of their neighbors and communities: to get out of Love Canal, to run away from Three Mile Island, or to flee a burning airplane that has barely managed to land.
In fact, research indicates that people rarely panic even when it might be rational to do so. I have reviewed scores of cases of natural and technological disasters—trains derail and release toxic chemicals that endanger a town, earthquakes shake a city to its core, fires threaten to level an entire neighborhood—and have found very few instances of uncontrolled flight at the expense of others. After the Chernobyl catastrophe in 1986 there was some panic, though that response might have been highly sensible. The U.S. firebombing of Tokyo in World War II also elicited some cases of panic. With exceptions of that sort, it is hard to find widespread panic after any type of disaster. Even events such as the fire at the Beverly Hills Supper Club and the stampede at the Who concert, which are commonly thought of as examples of panic, were not (Johnson 1987). Rather than panic, the modal reaction is one of terror, followed by stunned reflection or sometimes anomie, and ending with a fairly orderly response (e.g., reconstruction or evacuation). Even in the horrors chronicled by the U.S. Strategic Bombing Survey, cities burn, bodies explode, houses fall down, and still people do not panic (Janis 1951; Hersey 1985).
One way to classify research on risk is in terms of micro and macro perspectives. Both micro and macro studies have made important contributions to an understanding of the connections between risk, technology, and society. Micro-level research, generally speaking, is concerned with the personal, political, and social dilemmas posed by technology and activities that threaten the quality of people's lives. Macro-level work on risk does not deny the importance of micro-oriented research but asks different questions and seeks answers to those questions at an institutional level of analysis.
As some of the examples below illustrate, much macro work emphasizes the importance of the institutional context within which decisions about risk are made. Sociologists of risk are keen to distinguish between public and private decisions. Some people make choices that affect mainly themselves, while those in positions of authority make choices that have important implications for others. This is only one among many ways in which the sociology of risk is concerned with issues of power and the distribution of hazards and benefits.
THE MICRO LEVEL
As noted above, a substantial body of work has demonstrated that the public overestimates threats that are dramatic (e.g., from airplane accidents), particularly violent (e.g., from handguns), and potentially catastrophic (e.g., from nuclear power plants). Similarly, people tend to underestimate more prosaic, chronic threats such as those from asthma and diabetes. Several explanations for this phenomenon have been proposed; the most convincing focuses on the mechanisms through which information about risks is channeled to people (Kahneman et al. 1982). Specifically, the media—especially newspapers and television—are more likely to feature dramatic, violent, or catastrophic calamities than less sensational threats. One reason the media find such risks more interesting is that they are easier to cover and hence fit more easily into tight deadlines; covering prosaic risks is more time-consuming than covering short, dramatic accidents. Thus, there are good structural reasons why the media pay attention to high-drama risks and neglect low-drama risks. Scholars are able to explain why the public has biased estimates of risk by focusing on the structural connections between people and the media, specifically on the constraints that lead the media to be biased toward certain types of information.
Another example at the micro level of analysis is found in the work of Heimer (1988, 1992; Heimer and Staffen 1998), who analyzed how information is used and transmitted in intensive care units for infants. Her study is cast at a micro level of analysis in the sense that one of her concerns is how parents think about information regarding terribly sick babies. However, like all good sociological studies, Heimer's connects what parents think with the social contexts in which they find themselves. For example, one of her findings is that when hospital personnel transmit information to parents about their infants, that process is structured to protect the hospital from lawsuits and only secondarily to apprise parents of the condition of their children. Thus, Heimer describes, much as a psychologist might, how parents think but also demonstrates that how they think is contingent on the organizational needs of hospitals.
THE MACRO LEVEL
Macro-level work on risk includes research on how professionals influence the behavior of regulatory agencies, how organizations blunder and break down (Vaughan 1999), how social movements arise to push issues into the public debate, and how national cultures influence which risks are considered acceptable (Douglas 1985). Many macro theorists are deeply concerned with how the institutional structure of society makes some risks more likely than others to gain political and intellectual attention (Clarke 1988). Consider motor vehicle risks. Nearly 40,000 people are killed on U.S. highways every year, and most people would agree that that is an appalling mortality rate. Although it is a commonplace that half those deaths are alcohol-related, it is not really known how much of the carnage is in fact due to alcohol (Gusfield 1981). Nevertheless, it is reasonable to assume that a significant proportion is caused by drunk drivers (even 10 percent would be 4,000 people a year). Most people would agree that 4,000 deaths per year is grounds for concern, yet it is also known that high rates of fatal traffic accidents are associated with speeding, wrong turns, and improper passing. Objectively, there is no difference between a death caused by someone making a wrong turn and one caused by a drunk driver, yet most cultures say the two deaths are very different. In the United States there is even a small social movement galvanized around the issue of drunk driving, an example of which is the organization called Mothers Against Drunk Driving. Why is there no organization called Mothers Against Improper Passers? A sociological answer is that most cultures frown on using drugs to alter one's degree of self-control and define a person who does so as morally decadent and lacking in social responsibility. Thus the opprobrium unleashed on drunk drivers has less to do with the objective magnitude of the problem than with the apparent danger such drivers represent to the cultural value of self-control.
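The back-of-the-envelope arithmetic above can be made explicit. The short sketch below simply recomputes the toll under a few assumed alcohol-attribution fractions; apart from the 10 percent figure mentioned in the text, the fractions are illustrative assumptions, not estimates from the research literature.

```python
# Illustrative arithmetic only: annual U.S. highway deaths attributable to
# alcohol under several assumed attribution fractions. The 40,000 total comes
# from the text; the fractions (other than 10 percent) are assumptions.
ANNUAL_HIGHWAY_DEATHS = 40_000

for assumed_fraction in (0.10, 0.25, 0.50):
    attributable = ANNUAL_HIGHWAY_DEATHS * assumed_fraction
    print(f"If {assumed_fraction:.0%} of deaths involve alcohol: "
          f"{attributable:,.0f} deaths per year")
```

Even the most conservative assumption here reproduces the 4,000 figure cited above, which is the point of the passage: the moral energy directed at drunk driving cannot be explained by the body count alone.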
Another example of macro work on risk, this time concerning organizations and symbols, is the oil spill from the Exxon Valdez tanker in March 1989. At the time, the Exxon spill was the worst ever to occur in U.S. waters, releasing at least eleven million gallons into Prince William Sound and the Gulf of Alaska. The spill caused massive loss of wildlife, and although no people died, it did disrupt social relationships, create a political crisis, and reorient debates about the safety of oil transportation systems in the United States. From a sociological point of view, one of the most interesting things about the spill is how corporations and regulatory agencies plan for large oil spills. Sound research shows that not much can be done about large amounts of spilled oil (Clarke 1990), yet organizations continue to create elaborate plans for what they will do to contain large spills and how they will clean the oil from beaches and shorelines. They do this even though there has never been a case of successful containment or recovery on the open seas. Organizations create plans that will never work because such plans are master metaphors for taming the wild, subjugating uncertainty, and proclaiming expertise. In modern societies, expert knowledge and rational organization are of paramount importance, and there now seems to be an institutionalized incapacity to admit that some things may be beyond people's control (Clarke 1999).
Another example is the 1984 tragedy in Bhopal, India. At least 2,600 people died when a complex accident in a Union Carbide plant released toxic chemicals into the environment. At the time, Bhopal was the worst single industrial accident in history (the nuclear meltdown at Chernobyl in 1986 will eventually lead to more deaths). The Bhopal tragedy was certainly an organizational failure, as studies have documented (Shrivastava 1987). However, what was most interesting about the Bhopal accident was that the risk created by the Union Carbide chemical plant had become institutionalized to the point where very few, if any, of the key players were worried about a potential catastrophe. The poor people who lived next to the plant seemed to have accepted official assurances that they were safe and in any case had little choice in the matter. Government officials—the ones assuring those who lived near the plant of their safety—seemed to have accepted the disastrous potential of the plant as part of the price of having a large corporation in their country. For their part, corporate officials and experts seemed to have given insufficient thought to the possibility of killing several thousand Indians. One reason the Bhopal disaster is sociologically interesting is the degree to which groups and organizations come to accept risk as part of their everyday lives. The same observation might be made of automobile driving, nuclear power plants, and lead-contaminated water pipes (even brass pipes contain about 7 percent lead).
Anyone reading about how social organization and technology can break down must wonder whether it must always be so. Some scholars believe it need not be, and although they may be wrong (Clarke 1993), they cannot be ignored. High-reliability organizations (HROs), as they are called, are said to be possible: organizations so safe that they are nearly error-free. The claim is not that these organizations cannot fail but that they never do so in an important way (Roberts 1993). Somehow these organizations (a U.S. nuclear aircraft carrier is a good example) are able to maintain a strict hierarchy while enabling people low in the hierarchy to intervene in the functioning of the organization to prevent failure (contradicting how most organizations work). Moreover, rather than cover up their mistakes as most organizations do, HROs try hard to learn from them. Finally, these organizations have a great deal of redundancy built into them, preventing small errors from escalating into complete system failure.
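The appeal of redundancy rests on a simple piece of probability: if backup components fail independently, the chance that all of them fail at once shrinks rapidly as backups are added. The toy calculation below is a minimal sketch of that logic, with assumed failure probabilities rather than figures from the high-reliability literature; its closing lines anticipate the normal accident theorists' objection, taken up next, that failures are often not independent at all.

```python
# Toy reliability model: probability that every one of n redundant, identical
# components fails at once. All probabilities here are assumed for illustration.
def p_total_failure(p_component: float, n_redundant: int) -> float:
    """Chance that all n components fail, assuming failures are independent."""
    return p_component ** n_redundant

p = 0.01  # assumed chance that a single component fails on demand
for n in (1, 2, 3):
    print(f"{n} component(s): P(system failure) = {p_total_failure(p, n):.8f}")

# The catch is the independence assumption. A common-cause event (a shared power
# supply, a single maintenance error) can defeat every backup at once, so the
# real chance of total failure is bounded below by that event's probability.
p_common_cause = 0.001  # assumed probability of a failure mode shared by all backups
print(f"With a common cause: P(system failure) >= {p_common_cause}")
```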
High-reliability theory is animated by a disagreement with what is called normal accident theory (NAT) (Perrow 1984). NAT views the complexity of high-technology systems as problematic. The components of a complex system, say a nuclear power plant, can fail and interact in ways that no one could have anticipated and that no one understands as a catastrophe unfolds. On this view, rather than adding safety, redundancy can add technical complexity and lead to the formation of political interest groups, both of which can interfere with safe operations (Sagan 1993). Rather than honest learning, NAT stresses, managers and experts often engage in symbolic representations of safety, trying to convince the public, regulators, and other organizations that they are on top of potential problems (Clarke and Perrow 1996). An important contribution of NAT lies in locating the source of risk in organizations per se. Its structural emphasis draws attention away from easy and familiar explanations such as human error.
The contrast between NAT and HRO theory goes beyond their assessments of the inevitability of organizational failure, for the two schools of thought exemplify the concern with social order and disorder that I mentioned above. High-reliability theory is optimistic about human perfectibility, highlighting society's tendency to create and maintain order; normal accident theory is pessimistic about human perfectibility, fundamentally viewing failure and disorder as inherent in the human condition.
An important work that emphasizes the imperfections of organization is Vaughan's book on the Challenger accident (1996). Vaughan argues that what looked like a highly risky decision—to send the Challenger up that day—was in fact normal given the routines and expectations that organized the thoughts of the officials and experts involved in that choice. An important reason for this "normalization of deviance" was the high production pressures that the decision makers faced. It was not a matter of people deliberately taking chances they knew were unreasonable because of clear, external, imposing pressures. It was a more subtle process by which the very definition of "reasonable" shifted in a way that did not contravene those pressures. Vaughan's view stresses the commonality of error in all organizations. Some organizations make computer chips, some make space flights, but all fail. Vaughan details the mechanisms that produced a certain worldview of danger and safety that prevailed at NASA, and in so doing she connects micro and macro, structure and culture.
Every year the natural environment seems to be more polluted than it was the preceding year. Why? A commonsense explanation might claim that people do not care enough about the environment, perhaps attributing callous attitudes and personal greed to politicians and corporations. Such an explanation would focus on the motives of individual managers and politicians but, from a sociological point of view, would miss the all-important institutions in which such decision makers function. Sociologists know that those who occupy top positions in government and corporate organizations are not without intelligence and good sense. These decision makers may be individually quite concerned about environmental degradation, but because of their structural locations they are subject to pressures that may be at odds with environmental health and welfare. These pressures originate in specific social structures that create institutional interests that may be contrary to individual preferences (Clarke 1988). A corporate executive seeks (and must seek) to remain in business, if necessary at the expense of others' well-being or the environment. A similar explanation accounts for why Ford's president, Lee Iacocca, marketed Pintos in the 1970s that had a propensity to explode and burn. In other words, market institutions are arranged so that it is sensible for any individual or organization to force negative externalities on society. For its part, one of the key functions of government is to maintain a political and economic environment that is favorable to business. Thus, an explanation that centers on the institutional constraints and incentives that shape decisions about pollution can account for the behavior of organizations, experts, and officials far better than can an explanation that focuses on their personal characteristics.
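The externality argument in this paragraph can be restated as a small calculation. In the sketch below every dollar figure is invented for illustration; the only point is that when most of the cost of pollution falls on others, the choice that maximizes a firm's private net gain can still leave society worse off.

```python
# Illustrative negative-externality arithmetic; all dollar figures are assumptions.
private_benefit_of_polluting = 1_000_000   # production cost savings kept by the firm
private_cost_of_polluting = 50_000         # expected fines and bad press borne by the firm
external_cost_to_society = 5_000_000       # health and cleanup costs borne by others

firm_net = private_benefit_of_polluting - private_cost_of_polluting
social_net = firm_net - external_cost_to_society

print(f"Firm's private net gain: {firm_net:>12,}")    # positive, so the firm pollutes
print(f"Society's net outcome:   {social_net:>12,}")  # negative, so society is harmed
# The firm's decision turns on the first number, not the second, which is why
# forcing externalities onto society can be 'sensible' from inside the market.
```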
FUTURE DIRECTIONS
Future developments in the sociology of risk probably will revolve around issues of social conflict: its bases, meaning, and role in spurring social change. Society—and sociology—will be confronted with fundamental dilemmas in the twenty-first century. Society will have to deal with issues of environmental justice and the likelihood that pollution and risk are unequally distributed. Modernity brings both fruits and poisons. In particular, people must come to grips with what may be the primary dilemma of modern times: How can an industrial and democratic system that yields such a high standard of living also be involved in the creation of terrible hazards? Answering this question will require a recognition that many of the most frightening threats—nuclear meltdowns near large cities, toxic leachate in water tables, ozone destruction, explosions of liquefied natural gas from supertankers, failure to contain nuclear waste—almost seem beyond control. It may be that before society can better control political and technological systems, people must admit that some aspects of the technical world are not within human control.
REFERENCES
Clarke, Lee 1988 "Explaining Choices among Technological Risks." Social Problems 35(1):501–514.
—— 1990 "Oil Spill Fantasies." Atlantic Monthly, November, pp. 65–77.
—— 1993 "Drs. Pangloss and Strangelove Meet Organizational Theory: High Reliability Organizations and Nuclear Weapons Accidents." Sociological Forum 8(4):675–689.
—— 1999 Mission Improbable: Using Fantasy Documents to Tame Disaster. Chicago: University of Chicago Press.
——, and Charles Perrow 1996 "Prosaic Organizational Failure." American Behavioral Scientist 39(8):1040–1056.
——, and James F. Short, Jr. 1993 "Social Organization and Risk: Some Current Controversies." Annual Review of Sociology 19:375–399.
Couch, Stephen R., and J. Stephen Kroll-Smith 1985 "The Chronic Technical Disaster." Social Science Quarterly 66(3):564–575.
Douglas, Mary 1985 Risk Acceptability According to the Social Sciences. New York: Russell Sage Foundation.
Erikson, Kai 1990 "Toxic Reckoning: Business Faces a New Kind of Fear." Harvard Business Review 90(1):118–126.
—— 1994 A New Species of Trouble: Explorations in Disaster, Trauma, and Community. New York: Norton.
Freudenburg, William R. 1988 "Perceived Risk, Real Risk: Social Science and the Art of Probabilistic Risk Assessment." Science 242:44–49.
Gusfield, Joseph 1981 The Culture of Public Problems: Drinking-Driving and the Symbolic Order. Chicago: University of Chicago Press.
Heimer, Carol A. 1988 "Social Structure, Psychology, and the Estimation of Risk." Annual Review of Sociology 14:491–519.
—— 1992 "Your Baby's Fine, Just Fine: Certification Procedures, Meetings, and the Supply of Information in Neonatal Intensive Care Units." In James F. Short, Jr., and Lee Clarke, eds., Organizations, Uncertainties, and Risk. Boulder, Colo.: Westview Press.
——, and Lisa R. Staffen 1998 For the Sake of the Children: The Social Organization of Responsibility in the Hospital and the Home. Chicago: University of Chicago Press.
Hersey, John 1985 Hiroshima. New York: Knopf.
Janis, Irving Lester 1951 Air War and Emotional Stress. New York: McGraw-Hill.
Johnson, Norris R. 1987 "Panic and the Breakdown of Social Order: Popular Myth, Social Theory, Empirical Evidence." Sociological Focus 20(3):171–183.
Kahneman, Daniel, Paul Slovic, and Amos Tversky 1982 Judgment under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press.
Mellers, B. A., A. Schwartz, and A. D. J. Cooke 1998 "Judgment and Decision Making." Annual Review of Psychology 49:447–477.
Oliver-Smith, Anthony 1996 "Anthropological Perspectives on Hazards and Disasters." Annual Review of Anthropology 25:303–328.
Perrow, Charles 1984 Normal Accidents: Living with High Risk Technologies. New York: Basic Books.
Roberts, Karlene H., ed. 1993 New Challenges to Understanding Organizations. New York: Macmillan.
Sagan, Scott 1993 The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, N.J.: Princeton University Press.
Shrivastava, Paul 1987 Bhopal: Anatomy of a Crisis. Cambridge, Mass.: Ballinger.
Sorokin, Pitirim A. 1968 [1942] Man and Society in Calamity. New York: Greenwood Press.
Starr, Chauncey 1995 "A Personal History: Technology to Energy Strategy." Annual Review of Energy and the Environment 20:31–44.
Vaughan, Diane 1996 The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press.
—— 1999 "The Dark Side of Organizations." Annual Review of Sociology 25:271–305.
Lee Clarke