Normal Accidents: Living with High-Risk Technologies

:''New financial instruments such as derivatives and hedge funds and new techniques such as programmed trading further increase the complexity of interactions. Breaking up a loan on a home into tiny packages and selling them on a world-wide basis increases interdependency.''<ref>{{br|Normal Accidents}} p. 385. This in 1999, for Pete’s sake</ref>  


===How to deal with [[system accidents]]===
So, financial services [[risk controller]]s take note: if your system is a [[complex]], [[tightly-coupled]] system — and it is — ''you cannot solve for systemic failures''. You can’t prevent them. You have to have arrangements in place to ''deal'' with them. These arrangements need to be able to deal with the unexpected interactions of components in a ''[[complex]]'' system, not the predictable effects of a merely ''[[complicated]]'' one.  


Why make the distinction between [[complex]] and [[complicated]] like this? Because we in the financial services industry are in the swoon of automated, pre-configured safety mechanisms — think [[chatbot]]s, [[risk taxonomy|risk taxonomies]], [[playbook]]s, [[checklist]]s, [[neural networks]], even ~ ''cough'' ~ [[contract|contractual rights]] — and while these may help resolve isolated and expected failures in ''complicated'' components, they have ''no'' chance of resolving systems failures, which, by definition, will confound them. Instead, these safety mechanisms ''will get in the way''. They are ''of'' the system. They are ''part'' of what has failed. Not only that: safety mechanisms, by their existence, ''add'' [[complexity]] to the system — they create their own unexpected interactions — and when a system failure happens they can make it ''harder'' to work out what is going on, much less how to stop it.
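
To put a rough number on that last point, here is a back-of-the-envelope sketch. Nothing in it comes from Perrow, and the component names are invented; it simply counts the pairwise channels through which the parts of a system ''could'' interact, before and after some safety kit is bolted on.

<syntaxhighlight lang="python">
# Illustrative sketch only: component names are invented, and nothing here
# comes from Perrow's book. The point is arithmetic: every component added
# (safety kit included) opens new channels to everything already there.
from itertools import combinations

def interaction_channels(components):
    """Count the pairwise channels through which components could interact."""
    return len(list(combinations(components, 2)))

core = ["trading system", "risk engine", "collateral system", "settlement"]
print(interaction_channels(core))        # 6 potential channels among 4 components

# Bolt on two "safety" layers and count again.
with_safety = core + ["risk taxonomy", "monitoring chatbot"]
print(interaction_channels(with_safety)) # 15 potential channels among 6 components
</syntaxhighlight>

Four components give six channels in which surprises can brew; add two safety layers and there are fifteen. The safety kit has not shrunk the space of unexpected interactions; it has more than doubled it.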


===Inadvertent complexity===
So far, so hoopy; but here’s the rub: we can make our systems less complex and, to an extent, reduce [[tight coupling]] by careful design and iterative improvement.<ref>Air transport has become progressively less complex as it has developed. It has learned from each accident.</ref> But it is axiomatic that we can’t eliminate complexity altogether. And we tend not to simplify. We like to add prepackaged components to the process: policies, processes, rules, and new-fangled bits of kit.


Here is where the folly of [[complicated]] safety mechanisms comes in: adding linear safety systems to a system ''increases'' its complexity, and makes dealing with systems failures, when they occur, even harder. Not only do linear safety mechanisms exacerbate or even create their own accidents, but they also afford a degree of false comfort that encourages managers (who typically have financial targets to meet, not safety ones) to run the system harder, thus tightening the coupling between unrelated components. That same Triple A rating that lets your risk officer catch some zeds at the switch encourages your trader to double down. ''I’m covered. What could go wrong?''


Part of the voyeuristic pleasure of Perrow’s book is the salacious detail with which he documents the sequential failures at Three Mile Island, the Space Shuttle ''Challenger'', Air New Zealand’s Erebus flight, among many other disasters and near-misses. The chapter on maritime collisions would be positively hilarious were it not so distressing.


===“Operator error” is almost always the wrong answer===