===“Operator error” is almost always the wrong answer===
Human beings being system components, it is rash to blame them when they are components constitutionally disposed to fail — we are frail, mortal, inconstant, narratising beings — even when not put in a position, through system design or economic incentive, that makes failure inevitable. A ship’s captain who is expected to work a 48-hour watch and meet unrealistic deadlines is hardly positioned, let alone incentivised, to prioritise safety. Perrow calls these “forced operator errors”: “But again, ‘operator error’ is an easy classification to make. What really is at stake is an inherently dangerous working situation where production must keep moving and risk-taking is the price of continued employment.”<ref>{{br|Normal Accidents}} p. 249.</ref>
If an operator’s role is simply to carry out a tricky but routine part of the system, then the inevitable march of technology makes this ever more a fault of design and not personnel: humans, we know, are not good computers. They are good at figuring out what to do when something unexpected happens; making decisions; exercising judgment. But they — ''we'' — are ''lousy'' at doing repetitive tasks and following instructions. As ''The Six Million Dollar Man'' had it, ''we have the technology''. We should damn well use it.