NORMAL ACCIDENTS
The concept of normal accidents was formulated by sociologist Charles Perrow in Normal Accidents: Living with High-Risk Technologies (1984), but it is related to a number of other analyses of accidents in complex, technological societies. Perrow used the concept to describe a type of accident that inevitably results from the design of complex mechanical, electronic, or social systems. The theory has had a lasting influence on subsequent analyses of accidents and errors, especially those related to advanced technologies.
Perrow's Normal Accidents
A normal accident in Perrow's sense begins with the unexpected, interactive failure of two or more components. Such failures are not by themselves sufficient to cause a normal accident when there is enough time to solve the problem before it becomes critical; normal accidents occur only in systems that, in addition to being complexly interactive, are also tightly coupled. One example would be two components whose failures simultaneously start a fire and silence the fire alarm. Intervention by system operators in the early minutes or hours of such an incident often makes things worse, as when manually activating the fire alarm opens doors that allow the fire to spread.
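Perrow's argument is qualitative, but the interplay of his two conditions can be illustrated with a toy simulation. The sketch below is purely illustrative and is not part of Perrow's analysis: the failure probability and the recovery chances assigned to loosely and tightly coupled systems are invented assumptions. It models a hidden interaction (the same fault that starts the fire also mutes the alarm) and shows that the same interactive failure becomes a system accident far more often when coupling is tight, because operators have no slack time in which to intervene.

    import random

    def run_incidents(tightly_coupled, trials=10000, seed=1):
        """Toy model of Perrow's two conditions (illustrative only, not Perrow's own).

        A rare component failure also disables the warning signal (a hidden,
        interactive failure). In a loosely coupled system operators have slack
        time to catch the problem anyway; in a tightly coupled system they do not.
        """
        rng = random.Random(seed)
        accidents = 0
        for _ in range(trials):
            if rng.random() >= 0.05:          # no initiating failure on this run
                continue
            # Hidden interaction: the fault that starts the fire also mutes the alarm,
            # so recovery depends entirely on how much slack time the coupling allows.
            recovery_chance = 0.10 if tightly_coupled else 0.90   # assumed values
            if rng.random() > recovery_chance:
                accidents += 1
        return accidents / trials

    print("loosely coupled system accident rate:", run_incidents(tightly_coupled=False))
    print("tightly coupled system accident rate:", run_incidents(tightly_coupled=True))

In the loosely coupled case most of the rare interactive failures are caught in time; in the tightly coupled case most of them become accidents, which is the pattern Perrow's examples describe.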
Perrow believes that normal accidents are an inevitable consequence of human reliance on complex and tightly coupled systems. By confronting the causes of normal accidents, the designers, users, and potential victims—in fact, society as a whole—can make appropriate practical and ethical decisions about the systems involved. Once we understand why normal accidents occur, and that they are almost inevitable in complex systems, Perrow suggests, "we are in a better position to argue that certain technologies should be abandoned, and others, which we cannot abandon because we have built much of our society around them, should be modified" (Perrow 1984, p. 4).
In Normal Accidents, Perrow provides several examples to flesh out his argument. One case involves the loss of the two-square-mile Lake Peigneur in Louisiana. The lake was in simultaneous use by shipping companies (via a canal connected to the Gulf of Mexico), fishermen, tourists (the Rip van Winkle Live Oak Gardens was on its banks), and oil companies (Texaco was drilling for oil in a part of the lake only three to six feet deep). Under the lake was a salt mine operated by the Diamond Crystal Company. Texaco's oil rig penetrated the mine and vanished from sight, after which all of Lake Peigneur drained into the mine, creating a whirlpool that pulled in several barges, a tug, and sixty-five acres of the Rip van Winkle Gardens. The canal to the Gulf reversed course, creating a 150-foot waterfall as the lake drained away. An underground natural gas well ruptured, and the escaping gas bubbled to the surface, caught fire, and burned. In just seven hours, Lake Peigneur was gone; remarkably, not a single life was lost.
The accident was caused by the fact that the lake, oil rig, and mine were complexly interactive and tightly coupled. None of the subsystem operators understood the relationships among the parts, and they did not communicate adequately with one another. The Lake Peigneur incident illustrates another of Perrow's points, about the social allocation of responsibility. Instead of analyzing the system as a whole with an eye to reducing complexity or ameliorating the tight coupling, each of the players held the others responsible, Texaco accusing Diamond Crystal and vice versa. In analyzing the near-meltdown at the Three Mile Island nuclear plant in 1979, Perrow noted that the equipment vendor and the system operators blamed each other. Systems of adversarial litigation can, in such cases, militate against solving system problems.
Another phenomenon analyzed by Perrow is that of non-collision course collisions, in which ships on parallel, opposite courses suddenly turn and hit one another at the last moment. Perrow tells the story of the Coast Guard cutter Cuyahoga, operating at night. Although lookouts correctly interpreted the three lights visible on the Santa Cruz II to mean that the ship was headed toward them on a parallel course, they did not inform their captain, because they knew he was aware of the other ship. What they did not realize was that the myopic captain had noted only two lights on the Santa Cruz II, interpreting these to mean that it was a smaller fishing vessel sailing ahead of the Cuyahoga in the same direction. As the Cuyahoga came closer to the freighter, the captain turned to port to pass outside the other ship. In reality, because the Santa Cruz II was headed toward him, he turned out of a parallel course that would have passed the Santa Cruz II without incident and directly into its track, causing a collision that took eleven lives.
Perrow argued that in the relatively brief moments available, operators who must function rapidly in real time construct a simplified view of the environment based on available, often incomplete, information. Once this has been accomplished, all contradictory information is excluded. A related problem is the extremely authoritarian command structure used at sea, in which first mates are much less comfortable questioning their captains than copilots are in the air. Such non-collision course collisions are common and, according to Perrow, constituted a majority of the cases he studied in which ships hit other ships.
Perrow noted differences in the social factors surrounding air and sea travel and transport that account for the much higher rate of accidents at sea. The differing factors include levels of government regulation, pressure to meet schedules, communication between captains and crew, and the social status of air versus sea travelers. He concluded that much of the technology developed to make aviation safer, such as traffic control systems, is not used at sea, though it could be.
Perrow also analyzed cases in which safety devices encourage people to engage in riskier behaviors. For example, the installation of new braking systems in trucks, decreasing the possibility of failure on mountain roads, has not resulted in a decline in the number of accidents: truck drivers who believe they have safer brakes drive faster, because doing so saves time and money. Similarly, in some industries such as marine transport, insurance may make owners complacent, as the real cost of upgrading ships to prevent loss may exceed that of replacing them. Studies of these and related phenomena in automobile accidents (and even in business and financial management) have resulted in the development of the concept of risk homeostasis, in which increases in safety tend to be offset by changes in behavior that once again raise risk to a certain acceptable level (Wilde 2001; Degeorge, Moselle, and Zeckhauser 2004).
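The risk homeostasis idea can be sketched as a simple feedback loop. The fragment below is only an illustration of the concept, not a model taken from Wilde or Perrow; the target risk level, the hazard figures, and the adjustment step are all invented assumptions. Behavior (here, speed) rises or falls until the risk the driver experiences settles back at the accepted target, so halving the hazard of the brakes ends up roughly doubling speed rather than reducing the risk actually run.

    def adjust_behavior(hazard_per_unit_speed, target_risk=0.6, speed=1.0, step=0.05):
        """Illustrative risk-homeostasis loop (all numbers are assumptions).

        The driver speeds up when experienced risk falls below the accepted
        target and slows down when it rises above it, stopping once the risk
        is within half an adjustment step of the target.
        """
        for _ in range(1000):                      # safety cap on iterations
            experienced_risk = hazard_per_unit_speed * speed
            if abs(experienced_risk - target_risk) < hazard_per_unit_speed * step / 2:
                break
            speed += step if experienced_risk < target_risk else -step
        return speed, hazard_per_unit_speed * speed

    # Safer brakes halve the hazard per unit of speed, but the driver compensates.
    for label, hazard in [("old brakes", 0.60), ("safer new brakes", 0.30)]:
        speed, risk = adjust_behavior(hazard_per_unit_speed=hazard)
        print(f"{label}: chosen speed {speed:.2f}, experienced risk {risk:.2f}")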
Perrow's analysis has largely been confirmed by high-profile system accidents that have occurred since the book was published. The losses of the space shuttles Challenger (complex interaction and tight coupling between the fragile O-rings and the explosive potential of the fuel tanks) and Columbia (complexity plus coupling between the disposable tanks, off which ice or insulation might fall, and the fragile tiles on the wings, which could be damaged by them) are two cases in point. In both instances the communication failures highlighted by Perrow are visible: the engineers on the Challenger knew that O-rings fail at freezing temperatures but could not get their managers to postpone the launch; those on the Columbia mission wanted military spy satellite photos of the wing tiles but could not get their supervisors to agree. The blackout in the eastern and central United States and part of Canada in August 2003 is another example: the highly interdependent utilities failed to function as part of one system; malfunctioning problem-detection software failed to warn of an overload at one provider, resulting in cascading failures of turbines; and the providers and utilities failed to warn one another of known problems.
Competing Analyses
Since Perrow's work, a number of studies have both criticized and extended his arguments. Among the most influential are Scott Sagan's The Limits of Safety (1993) and Dietrich Dörner's The Logic of Failure (1996). Sagan examines two competing theories of safety, normal accident theory and high reliability theory, for their ability to explain historical experience in the control of nuclear weapons. In opposition to normal accident theory, high reliability theory posits that systems can be made safe by employing redundancy measures, decentralizing authority so that those nearest a problem can make quick decisions, and rigorously disciplining operators. It rests on the optimistic belief that well-managed and well-designed organizations can be perfectly safe.
Sagan shows that nations such as the United States and Russia use high reliability theory to manage their nuclear weapons. He then provides several examples of accidents and near-accidents that challenge the central assumption of this theory, namely, that nuclear systems can be made safe. Sagan argues that the normal accident theory better explains nuclear weapons systems, which are so complex and tightly coupled that accidents, although rare, are inevitable. He points to such limitations on high reliability theory as conflicting goals and priorities, constraints on learning, limitations on leaders' ability to control the human and technical components of the system, and pressure to turn memories of failures into successes. Sagan concludes that more outside reviews and information sharing, changes in organizational cultures (including less faith in redundancy), complete nuclear disarmament, and decoupling interactions are all alternatives to increase the safety of nuclear weapons systems. None of these alternatives, however, is very likely to occur.
Dörner claims that our main shortcomings when faced with complex problems are a tendency to oversimplify and a failure to conceive of a problem within its system of interacting factors. Failure does not necessarily result from incompetence. For example, the operators of the Chernobyl nuclear reactor were experts, and in fact ignored safety standards precisely because they felt that they knew what they were doing.
Dörner identifies four habits of mind that account for the difficulty in solving complex problems: (a) slowness of thinking; (b) a desire to feel confident and competent; (c) an inability to absorb and retain large amounts of information; and (d) a tendency to focus on immediately pressing problems and to ignore the problems that solutions are likely to create. Dörner's work highlights the area of normal accident theory dealing with cognitive and psychological factors (i.e., human error) in accidents.
In line with Dörner's analysis, Keith Hendy's Systematic Error and Risk Analysis (SERA) software tool investigates, classifies, and tracks human error in accidents. It employs a five-step process that guides investigators through a series of questions and decision ladders in order to determine where errors occurred (Defence Research and Development Canada 2004).
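The published description of SERA is high level, so the fragment below is not the SERA tool and borrows nothing from it beyond the general idea: a hypothetical, question-driven classifier that walks an investigator down a small decision ladder to locate where in the perception-decision-action chain an error occurred. The questions and category names are invented for illustration.

    # Hypothetical question-driven error classifier, loosely inspired by the idea
    # of guiding an investigator through a decision ladder; not the actual SERA tool.
    DECISION_LADDER = [
        ("Did the operator receive the relevant information at all?", "information/detection failure"),
        ("Did the operator interpret the information correctly?", "situation-assessment error"),
        ("Did the operator choose an appropriate course of action?", "decision error"),
        ("Was the chosen action carried out as intended?", "execution error"),
    ]

    def classify_error(answers):
        """Walk the ladder top-down; the first 'no' locates the error stage."""
        for (_question, category), answer in zip(DECISION_LADDER, answers):
            if not answer:
                return category
        return "no operator error identified"

    # Example: information arrived and was understood, but the wrong action was chosen.
    print(classify_error([True, True, False, True]))   # -> "decision error"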
Perrow's initial work has thus sparked continuing analyses of complex technological systems and the causes of their failures, so that debates about the risks and benefits of technology are regularly influenced by normal accident theory. The results of such debates are nevertheless mixed. Perrow himself maintained that some systems, such as nuclear power, should be abandoned, while others, such as marine transport, require significant modification but can be made reasonably safe.
Perrow's book, though presented as a narrow study of the functioning of technological systems, is also a study of the psychology of human error, which could be fatal even in low-tech systems and is far more dangerous today given the speed, size, and clout of modern technology. Perrow's work deserves continuing recognition because he was arguably the first to argue that accidents, rather than coming as bolts from the blue, are inherent in the nature of complex systems, and that human provisions to avoid their consequences may actually engender more danger.
JONATHAN WALLACE
ADAM BRIGGLE
SEE ALSO Unintended Consequences.
BIBLIOGRAPHY
Degeorge, François; Boaz Moselle; and Richard J. Zeckhauser. (2004). "The Ecology of Risk Taking." Journal of Risk and Uncertainty 28(3): 195–215.
Dörner, Dietrich. (1996). The Logic of Failure: Recognizing and Avoiding Error in Complex Situations. New York: Perseus Books Group.
Perrow, Charles. (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books. Reprinted, Princeton, NJ: Princeton University Press, 1999. The leading analysis of the primary role of human error in high-technology accidents.
Sagan, Scott. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, NJ: Princeton University Press.
Wilde, Gerald J. S. (2001). Target Risk 2: A New Psychology of Safety and Health. Toronto: PDE Publications. Revised and expanded edition of a book first published in 1994.
INTERNET RESOURCE
Defence Research and Development Canada. "Using Human Logic and Technical Power to Explain Accidents." Available from http://www.drdc-rddc.dnd.ca/newsevents/spotlight/0402_e.asp.