Normal Accidents: Living with High-Risk Technologies

{{author|Charles Perrow}} was a sociologist who fell into the discipline of [[systems analysis]]: analysing how social structures like businesses, governments and public utilities, which are loose networks of autonomous individuals, actually work. Perrow’s focus fell upon organisations that present specific risks to operators, passengers and innocent bystanders — nuclear and other power stations, airlines, shipping lines: the read-across to financial systems is obvious — where a combination of what he termed '''[[complexity|complex interactions]]''' and '''[[tight coupling]]''' in distributed systems means that catastrophic accidents are not just likely but, from time to time, ''inevitable''. Such unpredictable failures are an intrinsic property of a complex, tightly coupled system, not merely a function of “operator error” that can be blamed on a negligent employee — although, be assured, that is how management will be [[inclined]] to characterise it if given half a chance.
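To make the intuition concrete, here is a toy Monte Carlo sketch of the idea (the model, component counts and probabilities are invented for illustration; nothing here comes from Perrow’s book). The components are individually reliable in both runs; the system-level tail risk is governed almost entirely by how tightly the parts are coupled:

<syntaxhighlight lang="python">
import random

def catastrophe_rate(n_components=50, p_fail=0.01, coupling=0.5, trials=10_000):
    """Each component fails independently with probability p_fail; every
    failure then stresses two random neighbours, each of which fails with
    probability `coupling`. Tight coupling lets isolated faults cascade."""
    catastrophes = 0
    for _ in range(trials):
        failed = {i for i in range(n_components) if random.random() < p_fail}
        frontier = list(failed)
        while frontier:
            frontier.pop()
            for _ in range(2):  # each failure stresses two random neighbours
                neighbour = random.randrange(n_components)
                if neighbour not in failed and random.random() < coupling:
                    failed.add(neighbour)
                    frontier.append(neighbour)
        if len(failed) > n_components // 2:  # majority failure = catastrophe
            catastrophes += 1
    return catastrophes / trials

# Identical parts, identical component failure rate; only the coupling differs.
print(catastrophe_rate(coupling=0.1))  # loose coupling: cascades fizzle out
print(catastrophe_rate(coupling=0.9))  # tight coupling: cascades are "normal"
</syntaxhighlight>

No individual component becomes any less reliable between the two runs; the danger lives entirely in the interactions, which is precisely Perrow’s point.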


The classic case of such a tightly-coupled system is a nuclear power plant. Perrow was drafted in to advise the presidential commission investigating the Three Mile Island accident, and the early part of his book contains a fascinating blow-by-blow account of how TMI unfolded and how close it came to being catastrophically worse than it was.

Yet while there were no fatalities, it would be premature to conclude that the technology is therefore safe.


{{Quote|Large nuclear plants of 1,000 or so megawatts have not been operating very long—only about thirty-five to forty years of operating experience exists, and that constitutes “industrial infancy” for complicated, poorly understood transformation systems.}}


The unnerving practical conclusion that Perrow draws is that, for all the easy speeches<ref name="syed">[https://www.thetimes.co.uk/article/f8a262f8-4490-11ec-b414-b1f6389ab345 We are too emotional about risk — no wonder we make bad decisions] — Matthew Syed, ''The Sunday Times'', 14 November 2021.</ref> given about the relatively low risk of nuclear power compared with traditional fossil fuel-based energy generation, it is just far too early to draw any meaningful conclusions about the tail risk of nuclear meltdown. The potential for unforeseeable accidents that trigger unstoppable catastrophic chain reactions is incalculable, and the time horizon over which these accidents could occur or have effect is literally millennial. In traditional industries, by contrast, these risks are better understood and generally less prevalent.


To claim that the statistics we have suggest nuclear power is safe<ref name="syed"/> is to mistake an “absence of evidence” for “evidence of absence”.
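A rough calculation makes the point (an illustrative back-of-the-envelope estimate, not a figure from the book). If serious accidents are modelled as rare independent events and zero meltdowns have been observed in <math>n</math> reactor-years, the statisticians’ “rule of three” puts the 95% upper confidence bound on the annual meltdown rate at roughly

:<math>\hat{\lambda}_{95\%} \approx \frac{3}{n}</math>

so even a hypothetical 10,000 clean reactor-years cannot rule out a true rate of one meltdown per roughly 3,300 reactor-years. A short, unblemished record distinguishes “safe” from “not yet unlucky” very poorly.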


===Financial services relevance===
This site is mostly concerned with financial services and not nuclear energy, of course. You would think [[financial services]] meet exactly the conditions of [[non-linearity]] and [[tight coupling]] that Perrow describes.
 
If this is right, it has profound consequences for how we who inhabit [[complex]], [[tightly-coupled]] systems should think about risk. If you work in [[financial services]], you ''do'' inhabit a complex, tightly-coupled system, and Perrow’s thesis seems unarguably right.


Yet you don’t hear many people in [[financial services]] talking about how to handle [[normal accidents]]. Instead you hear a lot about [[technological unemployment]] and how [[chatbot]]s are going to put us all out of work. Hmmm.