Normal Accidents: Living with High-Risk Technologies

This is one of those “books that will change your life”. Well, one that should change lives: that it was written as long ago as 1984 (Charles Perrow passed away in 2019) suggests that maybe it hasn’t, and that the irrationalities that motivate so much of what we do are more pervasive than plainly written common sense.

Charles Perrow was a sociologist who fell into the discipline of systems analysis: analysing how social structures like businesses, governments and public utilities, being loose networks of autonomous individuals, work. Perrow’s focus fell upon organisations that present specific risks to operators, passengers and innocent bystanders (nuclear and other power stations, airways, shipping lines; the read-across to the financial system is obvious) where a combination of complexity and tight coupling means that periodic catastrophic accidents are not just likely, but inevitable. It is an intrinsic property of a complex, tightly coupled system, not merely a function of operator error that can be blamed on a negligent employee, that it will fail catastrophically.

If this thesis is right, it has profound consequences for how we who work in complex, tightly coupled systems should think about risk. It seems inarguably right.

Complex interactions and tight coupling

First, some definitions. Perrow uses “complexity” (a topic which is beginning to infuse the advocacy part of this site) without the benefit of systems analysis, which hadn’t really been invented when he was writing, to describe interactions between non-adjacent subcomponents of a system that were neither intended nor anticipated by the system’s designers.

These represent interactions that were not in our original design of our world, and interactions that we as “operators” could not anticipate or reasonably guard against. What distinguishes these interactions is that they were not designed into the system by anybody; no one intended them to be linked. They baffle us because we acted in terms of our own designs of a world that we expected to exist—but the world was different.[1]

Contrast these with the much more common “linear interactions”, where parts of the system interact with other components that precede or follow them in the system in ways that are expected and planned. In a well-designed system, these will (of course) predominate: any decent system should mainly do what it is designed to do and not act erratically in normal operation. Some systems are more complex than others, but even the most linear systems are susceptible to some complexity, notably where they interact with the environment.[2]
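
A toy illustration of that footnote: Perrow’s rough rule of thumb is that a system counts as “complex” when around ten per cent of its interactions are of the unplanned kind, and as “linear” when fewer than one per cent are. The Python sketch below simply turns that rule of thumb into a classifier; the function, its inputs and the middle category are illustrative inventions, not Perrow’s.

    # Perrow's rough thresholds: "complex" systems have about ten per cent
    # complex interactions; "linear" systems fewer than one per cent.
    # The function and its example inputs are invented for illustration.
    def classify_system(complex_interactions: int, total_interactions: int) -> str:
        share = complex_interactions / total_interactions
        if share >= 0.10:
            return "complex system: ripe for system accidents"
        if share < 0.01:
            return "linear system: failures stay largely local"
        return "somewhere in between"

    # e.g. 120 unplanned interactions observed out of 1,000
    print(classify_system(120, 1000))  # complex system: ripe for system accidents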

However, complex interactions are only a source of catastrophe if another condition is satisfied: that the system is “tightly coupled”, meaning processes happen fast, can’t be turned off, and failing components can’t be isolated. Perrow’s observation is that systems tend to be more tightly coupled than we realise.
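
Perrow draws these two dimensions as a grid: interactions (linear to complex) on one axis, coupling (loose to tight) on the other, with only the complex-and-tight quadrant breeding “normal” accidents. A minimal sketch of that grid follows; the example placements broadly track the quadrants Perrow sketches in the book, but the data structure and the wording of the profiles are illustrative only.

    # Perrow's interaction/coupling grid as a lookup. Placements broadly
    # follow the book's quadrants; the dictionary itself is illustrative.
    systems = {
        "nuclear power plant": ("complex", "tight"),
        "dam":                 ("linear",  "tight"),
        "university":          ("complex", "loose"),
        "assembly line":       ("linear",  "loose"),
    }

    def accident_profile(interactions: str, coupling: str) -> str:
        if interactions == "complex" and coupling == "tight":
            return "prone to system ('normal') accidents"
        if coupling == "tight":
            return "failures spread fast, but along paths the designers foresaw"
        if interactions == "complex":
            return "surprises happen, but there is slack to recover"
        return "failures tend to stay local and fixable"

    for name, (i, c) in systems.items():
        print(f"{name}: {accident_profile(i, c)}")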

Cutting back into the language of systems analysis for a moment, consider this: linear interactions are a feature of simple or complicated systems. They can be “solved” in advance by pre-configuration; they can be brute-force computed, at least in theory; they can be managed by algorithm. Complex interactions, by definition, can’t: they are the interactions the algorithm didn’t expect.
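
The same point, in code: a linear system’s interactions can be enumerated and given pre-configured responses, whereas a complex interaction is, by definition, the pairing nobody put in the table. A minimal Python sketch, with all component names and responses invented for the purpose:

    # Every (upstream, downstream) pair the designers expected gets a
    # pre-configured response. A complex interaction is the pairing
    # nobody anticipated. All names here are invented.
    planned_responses = {
        ("pump", "valve"):     "throttle flow",
        ("valve", "boiler"):   "vent pressure",
        ("boiler", "turbine"): "trip the turbine",
    }

    def respond(upstream: str, downstream: str) -> str:
        try:
            return planned_responses[(upstream, downstream)]
        except KeyError:
            # the interaction the algorithm didn't expect: the operators are
            # now improvising in a world that differs from the one designed
            return "no procedure: operators improvise"

    print(respond("pump", "valve"))            # planned, linear
    print(respond("coffee machine", "relay"))  # unplanned: complex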

Inadvertent complexity

So far so hoopy; but here’s the rub: we can make systems and processes more or less complex and, to an extent, reduce tight coupling by careful system design. But adding safety systems to a system increases its complexity.
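
A back-of-the-envelope way to see why (the combinatorial arithmetic is the JC’s, not Perrow’s, whose argument is qualitative): every component you bolt on, safety device or otherwise, adds potential interaction pathways with everything already in the system, and some of those new pathways will be ones nobody designed or anticipated.

    # With n components there are n*(n-1)/2 possible pairwise interactions.
    # Adding safety subsystems adds components, and so adds pathways.
    # A rough illustration only.
    from math import comb

    def potential_pairings(n_components: int) -> int:
        return comb(n_components, 2)

    base = 50                # a hypothetical plant
    with_safety = base + 5   # bolt on five safety subsystems

    print(potential_pairings(base))         # 1225
    print(potential_pairings(with_safety))  # 1485: 260 new potential pathways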

See also

    Complexity

References

  1. Normal Accidents, p. 75. Princeton University Press. Kindle Edition.
  2. Perrow characterises a “complex system” as one where ten percent of interactions are complex, and a “linear system” as one where less than one percent of interactions are complex. The greater the percentage of complex interactions in a system, the greater the potential for system accidents.