System accident  

From The Art and Popular Culture Encyclopedia


A system accident, or normal accident, is an "unanticipated interaction of multiple failures" in a complex system. This complexity can be technological, organizational, or both. A system accident can be easy to see in hindsight but difficult to foresee; ahead of time, there are simply too many possible action pathways to consider them all.

These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and seemingly insignificant damage combine to form an emergent disaster. System accidents were described in 1984 by Charles Perrow, who termed them "normal accidents" and characterized them by interactive complexity, tight coupling, cascading failures, and opaqueness. James T. Reason extended this approach with human reliability and the Swiss cheese model, now widely accepted in aviation safety and healthcare.

Once an enterprise passes a certain point in size, with many employees, specialization, backup systems, double-checking, detailed manuals, and formal communication, employees can all too easily resort to protocol, habit, and "being right." Rather as when one watches a complicated movie in an unfamiliar language, the narrative thread of what is going on can be lost. Other phenomena, such as groupthink, can be occurring at the same time, for real-world accidents almost always have multiple causes. In particular, it is a mark of a dysfunctional organization to simply blame the last person who touched something.

In 2012 Charles Perrow wrote, "A normal accident is where everyone tries very hard to play safe, but unexpected interaction of two or more failures (because of interactive complexity), causes a cascade of failures (because of tight coupling)."

There is an aspect of an animal devouring its own tail, in that more formality and effort to get things exactly right can actually make the situation worse. For example, the more organizational rigmarole involved in adjusting to changing conditions, the longer employees will delay in reporting those conditions. The more emphasis on formality, the less likely employees and managers are to engage in real communication. And new rules can make the situation worse, both by adding another layer of complexity and by telling employees, yet again, that they are not to think but merely to follow the rules.

In a 1999 article primarily focusing on health care, J. Daniel Beckham wrote, "It is ironic how often tightly coupled devices designed to provide safety are themselves the causes of disasters. Studies of the early warning systems set up to signal missile attacks on North America found that the failure of the safety devices themselves caused the most serious danger: false indicators of an attack that could have easily triggered a retaliation. Accidents at both Chernobyl and Three Mile Island were set off by failed safety systems."




Unless indicated otherwise, the text in this article is either based on Wikipedia article "System accident" or another language Wikipedia page thereof used under the terms of the GNU Free Documentation License; or on research by Jahsonic and friends. See Art and Popular Culture's copyright notice.
