Normal Accidents: Living with High-Risk Technologies

{{a|devil|
[[File:Erebus.gif|450px|frameless|center|Air New Zealand Flight TE901]]
}}{{Def|Accident||n|}}  An inevitable occurrence due to the action of immutable laws. — {{author|Ambrose Bierce}}, {{br|The Devil’s Dictionary}}
 
This is one of those “books that will change your life”. Well — one that ''should'' change lives. That it was written in 1984 — {{author|Charles Perrow}} passed away in 2019 — suggests that maybe it hasn’t: that the irrationalities that motivate so much of what we do are more pervasive than plainly written common sense.


{{author|Charles Perrow}} was a sociologist who fell into the discipline of [[systems analysis]]: analysing how social structures like businesses, governments and public utilities, being loose networks of autonomous individuals, work. Perrow’s focus fell upon organisations that present specific risks to operators, passengers and innocent bystanders — nuclear and other power stations, airways, shipping lines, but the read-across to financial systems is obvious — where a combination of what he termed '''[[complexity|complex interactions]]''' and '''[[tight coupling]]''' in distributed systems means that catastrophic accidents are not just likely but, from time to time, ''inevitable''. Such unpredictable failures are an intrinsic property of a complex, tightly coupled system, not merely a function of “operator error” that can be blamed on a negligent employee — although be assured, that is how management will be [[inclined]] to characterise it if given half a chance.
