Normal Accidents: Living with High-Risk Technologies
This is one of those “books that will change your life”. Well, books that *should* change lives: the fact that it was written in 1984 (Charles Perrow passed away in 2019) suggests that perhaps it hasn’t, and that the irrationalities which motivate so much of what we do are more pervasive than plainly written common sense.
Charles Perrow was a sociologist who fell into the discipline of systems analysis: the study of how social structures such as businesses, governments and public utilities, being loose networks of autonomous individuals, actually work. Perrow’s focus fell upon organisations that present specific risks to operators, passengers and innocent bystanders (nuclear and other power stations, airlines, shipping lines, though the read-across to the financial system is obvious), where a combination of complexity and tight coupling means that periodic catastrophic accidents are not just likely but inevitable. That it will fail catastrophically is an intrinsic property of a complex, tightly coupled system, not merely a function of operator error that can be blamed on a negligent employee.
If he is right, this has profound consequences for how those of us who work within complex, tightly coupled systems should think about risk. And he seems inarguably right.
First, some definitions. Perrow uses “complexity” (a topic which is beginning to infuse the advocacy part of this site) without the benefit of modern complexity theory, which had not really been developed when he was writing, to describe interactions between discrete subsystems of an organisation that were not, and could not have been, anticipated by the system’s designers.
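Perrow’s argument is qualitative, but the intuition behind “complex interactions plus tight coupling” can be sketched numerically. The toy simulation below is my own illustration, not anything from the book: the cascade rule and every parameter are invented for the purpose. It models an organisation as a chain of subsystems with a handful of unanticipated cross-dependencies, and asks how often a single random component fault spreads into a system-wide accident when coupling is tight versus loose.

```python
import random

def accident_rate(n_subsystems=20, coupling=0.9, hidden_links=15, trials=5_000):
    """Toy cascade model (illustrative only, not Perrow's formalism).

    coupling     -- probability a failure propagates across a dependency;
                    tight coupling ~ high, loose coupling ~ low
    hidden_links -- extra dependencies the designers never anticipated,
                    standing in for Perrow's 'complex interactions'
    """
    accidents = 0
    for _ in range(trials):
        # Designed dependencies form a simple chain; the hidden links are
        # random cross-connections nobody planned for.
        links = [(i, i + 1) for i in range(n_subsystems - 1)]
        links += [tuple(random.sample(range(n_subsystems), 2))
                  for _ in range(hidden_links)]

        failed = {random.randrange(n_subsystems)}  # one small, random fault
        frontier = set(failed)
        while frontier:
            spread = set()
            for a, b in links:
                for src, dst in ((a, b), (b, a)):
                    if (src in frontier and dst not in failed
                            and random.random() < coupling):
                        failed.add(dst)
                        spread.add(dst)
            frontier = spread
        if len(failed) > n_subsystems // 2:  # call that a system-wide accident
            accidents += 1
    return accidents / trials

print("tightly coupled:", accident_rate(coupling=0.9))
print("loosely coupled:", accident_rate(coupling=0.2))
```

With the (arbitrary) numbers above, the tightly coupled run tends to fail system-wide on most trials while the loosely coupled one rarely does, which is the point: the cascade is a property of the structure, not of any single operator’s mistake.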