Normal Accidents: Living with High-Risk Technologies

Human beings being system components, it is rash to blame them when they are a component that is constitutionally disposed to fail — we are frail, mortal, inconstant, narratising beings — even when not put in a position, through system design or economic incentive, that makes failure inevitable. A ship’s captain who is expected to work a 48-hour watch and meet unrealistic deadlines is hardly positioned, let alone incentivised, to prioritise safety. Perrow calls these “forced operator errors”: “But again, ‘operator error’ is an easy classification to make. What really is at stake is an inherently dangerous working situation where production must keep moving and risk-taking is the price of continued employment.”<ref>{{br|Normal Accidents}} p. 249.</ref>  


If an operator’s role is simply to carry out a tricky but routine part of the system, then the march of technology makes this ever more a fault of design and not of personnel: humans, we know, are not good computers. They are good at figuring out what to do when something unexpected happens; making decisions; exercising judgment. But they — ''we'' — are ''lousy'' at doing repetitive tasks and following instructions. As ''The Six Million Dollar Man'' had it, ''we have the technology''. We should damn well use it.


If, on the other hand, the operator’s role is to manage ''[[complexity]]'' — then technology, checklists and pre-packaged risk taxonomies can only take you so far and, at the limit, can get in the way. Perrow’s account of the control deck at Three Mile Island, as reactor coolant pumps began cavitating, thumping and shaking, is instructive: