Normal Accidents: Living with High-Risk Technologies

First, some definitions.
*'''Complexity''': Perrow anticipates the later use of the concept of “[[complexity]]” — a topic which is beginning to infuse the advocacy part of this site — without the benefit of [[systems analysis]], which hadn’t really been invented when he was writing, using it to describe interactions between non-adjacent subcomponents of a system that were neither intended nor anticipated by the system’s designers. Complex interactions are not only unexpected but, for a period of time (which may be critical, if the interacting components are [[tightly coupled]]), ''incomprehensible''. This may be because the interactions cannot be seen, buried as they are under second-order control and safety systems, or even because they are not ''believed''. If your — ''wrong'' — theory of the game is that the risk in question is a [[ten sigma event]], expected only once in one hundred million years, you may have a hard time believing it could be happening in your fourth year of operation, as the partners of [[Long Term Capital Management]] may tell you (a back-of-the-envelope illustration follows these definitions). Here even [[epistemology]] is in play. Interactions that were not in our basic conceptualisation of the world are not ones we can reasonably anticipate. These interactions were, QED, not ''designed'' into the system; no one ''intended'' them. “They baffle us because we acted in terms of our own designs of a world that we expected to exist—but the world was different.”<ref>{{br|Normal Accidents}}, p. 75. Princeton University Press. Kindle Edition.</ref>
*'''[[Linear interactions]]''': Contrast [[complex interactions]] with the much more common “[[linear interactions]]”, where parts of the system interact with the components that precede or follow them in the system in ways that are expected and planned: “if ''this'', then ''that''”. In a well-designed system these will, of course, predominate: any decent system should mainly do what it is designed to do and not act erratically in normal operation. Some systems are more complex than others, but even the most linear systems are susceptible to some complexity — where they interact with the environment.<ref>Perrow characterises a “complex system” as one where ten percent of interactions are complex, and a “linear system” as one where less than one percent of interactions are complex. The greater the percentage of complex interactions in a system, the greater the potential for system accidents.</ref> Cutting back into the language of [[systems analysis]] for a moment, consider that [[linear interaction]]s are a ''feature'' of [[simple]] and [[complicated system]]s, and can be “pre-solved” and brute-force computed, at least in theory. They can be managed by [[algorithm]], or [[playbook]]. But [[complex interactions]], by definition, ''cannot'' — they are the interactions the [[algorithm]] ''didn’t expect'' (a toy sketch of the difference also follows these definitions).
*'''[[Tight coupling]]''': However, complex interactions are only a source of catastrophe if another condition is satisfied: that they are “tightly coupled” — processes happen fast, can’t be turned off, and failing components can’t be isolated. Perrow’s observation is that systems tend to be more tightly coupled than we realise.
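
By way of illustration only (this is not Perrow’s example, and the numbers depend entirely on the model you choose to believe), here, roughly, is what a plain Gaussian view of daily returns says about a [[ten sigma event]]:

<syntaxhighlight lang="python">
# A toy calculation, not Perrow's: what a Gaussian model of daily returns
# implies about a "ten sigma" loss.
from math import erfc, sqrt

sigmas = 10
# One-tailed probability of a move beyond ten standard deviations,
# assuming (wrongly, for markets) normally distributed returns.
p = 0.5 * erfc(sigmas / sqrt(2))  # roughly 7.6e-24

trading_days_per_year = 252  # an assumption, for the sake of the arithmetic
expected_wait_years = 1 / (p * trading_days_per_year)

print(f"P(ten sigma day): {p:.1e}")
print(f"Expected wait: {expected_wait_years:.1e} years")
# Many orders of magnitude longer than the age of the universe, so seeing one
# in your fourth year of operation says more about the model than the market.
</syntaxhighlight>

The exact figure is beside the point. If your model says an event is that remote and it happens anyway, the rational inference is that the model, not the world, is what has failed.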


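And to make the [[playbook]] point concrete, a minimal sketch (not drawn from the book; the component names and procedures are invented): the anticipated, linear interactions are pre-solved as a lookup, and a complex interaction is precisely the key that isn’t in it.

<syntaxhighlight lang="python">
# A minimal, invented sketch: the playbook covers the interactions the
# designers anticipated; a complex interaction is one it has no entry for.
PLAYBOOK = {
    ("coolant_pump", "failed"): "switch to backup pump",
    ("relief_valve", "stuck open"): "close the block valve",
    ("grid_power", "lost"): "start the diesel generator",
}

def respond(component: str, state: str) -> str:
    """Return the pre-solved response for an anticipated failure."""
    try:
        return PLAYBOOK[(component, state)]
    except KeyError:
        # The algorithm didn't expect this combination: for a time, the
        # operators are improvising in an incomprehensible situation.
        return "no procedure: improvise"

print(respond("coolant_pump", "failed"))                      # linear: pre-solved
print(respond("relief_valve", "indicating closed but open"))  # complex: not in the playbook
</syntaxhighlight>

Note that adding more entries to the playbook does not change its character: it only pre-solves more of the linear interactions.
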
===Normal accidents===
 
Where you have a complex system, you should ''expect'' accidents — and opportunities, quirks and serendipities — to arise from unexpected, non-linear interactions. Such accidents are, says Perrow, “normal”: not in the sense of being regular or expected — in the forty-year operating history of nuclear power stations there had, at the time of writing, been no catastrophic meltdowns<ref>“... but this constitutes only an “industrial infancy” for complicated, poorly understood transformation systems.” Perrow had a chilling prediction: “But the ingredients for such accidents are there, and unless we are very lucky, one or more will appear in the next decade and breach containment.” Ouch.</ref> — but in the sense that it is an inherent property of the system to have this kind of accident. Financial services [[risk manager]]s take note: you can’t solve for these kinds of accidents. You can’t prevent them. You have to have arrangements in place to deal with them. And these arrangements need to be designed to deal with the unexpected outputs of a ''[[complex]]'' system, not the predictable effects of a merely ''[[complicated]]'' one.


===Inadvertent complexity===