The Field Guide to Human Error Investigations

{{A|book review|{{image|field guide|jpg|}}}}{{br|The Field Guide to Human Error Investigations}}<br>{{author|Sidney Dekker}}{{c|Systems theory}}
===More on systems accidents===
Of a piece with {{author|Charles Perrow}}’s {{br|Normal Accidents}}, {{author|Sidney Dekker}}’s book is compelling in rooting the cause of accidents in poor system design and unnecessary complexity — and in the safety features and compliance measures overlaid on top, which only make the problem worse. That is, it lays the blame at the door of management, not of the poor benighted [[subject matter expert]]s who are expected to make sense of the [[Rube Goldberg machine]] that management expect them to operate.


There are two ways of looking at system accidents:
*'''[[Downgrading]] employees''': ''removing'' subject matter experts and replacing them with lower calibre (i.e., cheaper) employees with ''even less'' autonomy to follow the ''even more complicated'' rules and processes now introduced.


But blaming the [[meatware]] is to ignore history and condemn yourself to repeat it. Changing the make-up of your workforce won’t help if the basic conditions under which they are obliged to operate aren’t fixed. Simply adding more, increasingly detailed, policies — “codified over-reactions to situations that are unlikely to happen again” in {{author|Jason Fried}}’s elegant words<ref>{{author|Jason Fried}}, {{br|ReWork: Change the Way You Work Forever}}</ref> — will only make the gap between theory and practice wider.
 
===The work to rule as falsification of policy===
{{Work to rule capsule}}


===Reacting to failure===
Reactions to accidents tend:
*'''Retrospective''': To be made with the benefit of hindsight and full knowledge of inputs and outputs, and with ample time to construct a [[narrative]] that neatly links events into a causal chain ''that was not at all clear to the actors at the time''. This may be an impressive feat of imagination from a [[middle manager]] not normally known for their creative skills, but it ''is'' a feat of imagination.
*'''Proximal''': To blame the [[meatware]] at the sharp end, closest to the accident — the operators, traders, [[negotiator]]s — rather than those at the “blunt end”: the executive, its goals, target end-states, its strategic management of the process, how it balances risk and reward, what tools and equipment it provides, its rules and [[policy|policies]], and the constraints and pressures it imposes on those [[subject matter expert]]s to get the job done.
*'''Counterfactuals''': To construct alternative sequences of events — where operators “zigged” but could have “zagged” — which might have avoided the incident. “''Forks in the road stand out so clearly to you, looking back. But when inside the tunnel, when looking forward and being pushed ahead by unfolding events, these forks were shrouded in the uncertainty and [[complexity]] of many possible options and demands; they were surrounded by time constraints and other pressures.''”
*'''Judgmental''': To explain failure by seeking failure: incorrect analyses, mistaken perceptions, misjudged actions. Again, hindsight is king. In each case, if you presented the operator with the facts as they were available to the investigator, in the same unpressurised environment, you might expect the “correct” outcome.


====Common canards====