Normal Accidents: Living with High-Risk Technologies

So, financial services [[risk controller]]s take note: if your system is a [[complex]], [[tightly-coupled]] system — and it is — ''you cannot solve for systemic failures''. You can’t prevent them. You have to have arrangements in place to ''deal'' with them. These arrangements need to be able to deal with the unexpected interactions of components in a ''[[complex]]'' system, not the predictable effects of a merely ''[[complicated]]'' one.  


Why make the distinction between [[complex]] and [[complicated]] like this? Because we in the financial services industry are in the swoon of automated, pre-configured safety mechanisms — think [[chatbot]]s, [[risk taxonomy|risk taxonomies]], [[playbook]]s, [[checklist]]s, [[neural networks]], even ~ ''cough'' ~ [[contract|contractual rights]] — and while these may help resolve isolated and expected failures in ''complicated'' components, they have ''no'' chance of resolving system failures, which, by definition, will confound them. Instead, these safety mechanisms ''will get in the way''. They are ''of'' the system. They are ''part'' of what has failed. Not only that: safety mechanisms, by their very existence, ''add'' [[complexity]] to the system — they create their own unexpected interactions — and when a system failure happens they can make it ''harder'' to detect what is going on, much less how to stop it.


===When Kramer hears about this ...===