Normal Accidents: Living with High-Risk Technologies

===When Kramer hears about this ...===
[[File:Shit hits fan.jpg|300px|thumb|right|Kramer hearing about this, yesterday.]]
So far, so hoopy; but here’s the rub: we can make our systems less complex and ''reduce'' [[tight coupling]] by careful design, functional redundancy and iterative improvement,<ref>Air transport has become progressively less complex as it has developed. It has learned from each accident.</ref> but, as long as it is a complex system with the scope for complex interaction, ''we cannot eliminate [[system accident]]s altogether''. They are, as coders like to joke, a feature, not a bug.  


Furthermore, in our efforts to pre-solve for catastrophe, we tend ''not'' to simplify: to the contrary, we add prepackaged “risk mitigation” components — [[Policy|policies]], [[taxonomy|taxonomies]], [[key performance indicator]]s, [[tick-boxes]], processes, rules, and [[Chatbot|new-fangled bits of kit]] — all in the name of risk management. To be sure, these give our [[middle management]] layer comfort: they can set their [[RAG status]]es green, and may justify the planned evisceration of that cohort of troublesome [[subject matter expert]]s who tend to foul up the mechanics of the [[Heath Robinson machine]] — but who will turn out to be just the people you wish you hadn’t fired when the shit hits the fan.
 
Here is the folly of elaborate, [[complicated]] safety mechanisms: adding components to any complex system ''increases'' its complexity. That, in itself, makes dealing with [[system accident]]s, when they occur, ''harder''. The safety mechanisms beloved of the [[middle management]] layer derive from experience. They secure stables from which horses have bolted. They are, as Jason Fried elegantly put it, “organisational scar tissue. Codified responses to situations that are unlikely to happen again.”<ref>{{br|Rework}}, {{author|Jason Fried}}</ref>
They are, in a word, ''linear'' responses to what is by definition a ''non-linear'' problem.
 
Not only do linear safety mechanisms exacerbate or even create their own accidents, but they also afford a degree of false comfort that encourages managers, who typically have financial targets to meet rather than safety ones, to run the system harder, thus tightening the coupling between unrelated components. That same Triple A rating that lets your risk officer catch some zeds at the switch encourages your trader to double down. ''I’m covered. What could go wrong?''


Part of the voyeuristic pleasure of Perrow’s book is the salacious detail with which he documents the sequential failures at Three Mile Island, the Space Shuttle ''Challenger'' and Air New Zealand’s Erebus crash, among many other disasters and near-misses. The chapter on maritime collisions would be positively hilarious were it not so distressing.