The Field Guide to Human Error Investigations



There are two ways of looking at system accidents:
*'''It’s the [[meatware]]''': [[Complex]] (and [[complicated]]) systems would be fine ''were it not for the [[meatware]] screwing things up''. Human error is the main contributor to most system accidents, introducing unexpected failures into an essentially robust mechanism. Here, system design and management direction and oversight are otherwise effective strategies, let down by unreliable human operators.
*'''It’s the system''': Accidents are an inevitable by-product of operators doing the best they can within [[complex]] systems that contain unpredictable vulnerabilities, where risks shift and change over time and priorities are unclear, conflicting and variable. Here, human operators are let down by shortcomings in system design and conflicting management pressure.
===Blame the [[meatware]]===
Reactions to accidents tend to be:
*'''Retrospective''': made with the benefit of hindsight and full knowledge of inputs and outputs, and with ample time to construct a narrative that neatly links events into a causal chain — a causal chain that was not apparent to the actors at the time, and perhaps an impressive, but all the same imaginary, feat of reconstruction from a [[middle manager]] not normally known for creative skills.
*'''Proximal''': Focussing on the people at the sharp end — the operators, traders, negotiators — the [[meatware]] closest to the accident, and not so much on the blunt end — the organisation, its goals, target end-states and its strategic management of the manufacturing process (in particular, how it balances risk and reward), the tools and equipment provided, the rules and [[policy|policies]] promulgated, and the constraints and pressures imposed on those [[subject matter expert]]s to get the job done.
*'''Counterfactuals''': constructing alternative sequences of events — where operators “zigged”, but could have “zagged” — which might have avoided the incident. “''Forks in the road stand out so clearly to you, looking back. But when inside the tunnel, when looking forward and being pushed ahead by unfolding events, these forks were shrouded in the uncertainty and [[complexity]] of many possible options and demands; they were surrounded by time constraints and other pressures.''”
*'''Judgmental''': To explain failure we seek failure: incorrect analyses, mistaken perceptions, misjudged actions. Again, hindsight is king. In each case, if you presented the operator with the facts as they were available to the investigator, in the same unpressurised environment, you might expect the “correct” outcome.
===Common canards===
*'''Cause-consequence equivalence''': The assumption that a bad outcome must have had equally ''bad'' causes and, seeing as management-mandated and properly governed processes are unlikely to have been the product of really bad governance — we bureaucratised the shit out of that, after all — the malignant cause ''must'' therefore be the fault of a bad apple somewhere.
But bad outcomes are ''not'' necessarily caused by equally bad inputs: the [[Three Mile Island]] disaster was a concatenation of seemingly insignificant and benign, but unusual, events.


{{Sa}}
