Template:M intro crime tunnel vision

{{d|Prosecutor’s tunnel vision|/ˈprɒsɪkjuːtəz/ /ˈtʌnᵊl/ /ˈvɪʒᵊn/|n}}
{{drop|T|he collection of}} biases and cognitive gin-traps that can lead ''prosecutors'' — those who “prosecute” a particular theory of the world — to stick with it, however starkly it may vary from available evidence and common sense.
So named because it is often ''literal'' prosecutors, of crimes, who suffer from it. This kind of tunnel vision has led to notorious miscarriages of justice where innocent people come to be convicted notwithstanding clear and plausible alternative explanations for their ostensible “crimes”.

{{quote|
By tunnel vision, we mean that “compendium of common [[Heuristic|heuristics]] and logical fallacies,” to which we are all susceptible, that lead actors in the criminal justice system to “focus on a suspect, select and filter the evidence that will “build a case” for conviction, while ignoring or suppressing evidence that points away from guilt.” This process leads investigators, prosecutors, judges, and defence lawyers alike to focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.
: — ''The Multiple Dimensions of Tunnel Vision in Criminal Cases'' by Keith Findley and Michael Scott (2006)}}
 
{{drop|T|he same tunnel}} vision also motivates ideologies, conspiracies and management philosophy. 360-degree [[performance appraisal]]s, [[outsourcing]], the war on drugs and the worldwide [[Anti-money laundering|AML]] military-industrial complex are all cases where those “prosecuting” the theory stick with it even though the weight of evidence suggests it does not work and may even be counterproductive.


The “prosecutor’s tunnel” begins with clear but simplistic — ''misleading'' — models of a messy world. Humans have a weakness for these: we are pattern-matching, puzzle-solving animals. We are drawn to neatness. We resile from intractability as it indicates ''weakness'': that our frail human intellect has been defeated by the ineffable natural order of things.
{{drop|S|ometimes the sheer}} elegance of a prosecutor’s case can crowd out common sense and the basic intuition that ''this cannot be right''.


We have built our legal institutions to be vulnerable to this kind of crowding out. Criminal law proceeds upon [[data]] and the weight of ''evidence'' but disallows “intuition”. Hence, there is an asymmetry: evidence is better at saying what ''did'' happen than what did ''not''. This is especially so where there is no [[direct evidence]] that the defendant actually did what she is accused of.


Circumstantial evidence does not directly implicate a defendant but is [[consistent with]] the prosecution theory. It accumulates: if there is enough of it, and none points away from the defendant, it can tell us something. But, [[Correlation|correlation and causation]]: evidence that is “[[consistent with]]” a prosecution theory does not prove it: that JC owns a bicycle is ''consistent'' with his competing in the ''Tour de France''; it does not make him any more likely to ''do'' it. Evidence can look more meaningful than it is. This is where intuition ought to be able to help us.


As it is, intuition’s role is relegated to underpinning the presumption of innocence. A prosecutor must prove guilt; the accused need not prove ''anything'': she cannot be expected to explain what happened for the simple reason that an innocent person should have no better idea about it than anyone else. The jury, we hope, leans on its intuition when conjuring doubts.
Experience tells us otherwise. In what follows, JC takes three notorious cases from the antipodes to see what can happen when, with no direct evidence, those arguing the case become afflicted with tunnel vision, and intuition and common sense are relegated behind “data” and circumstantial evidence. Then we will look at what causes this condition.


{{gbullet|Case study: [[Lindy Chamberlain]]<li>Case study: [[Peter Ellis]]<li>Case study: [[David Bain]]}}

====Narrative biases====
{{drop|T|hese cases illustrate}} the problem of relying on circumstantial evidence: with no independent ''direct'' evidence, one tends to start with a hypothesis and fit whatever secondary and forensic evidence you have into it, discarding whatever does not fit. This is the classic [[Prosecutor’s tunnel vision|tunnel vision]] scenario. It can afflict those who would ''defend'' suspects just as firmly as those who prosecute them.
So if a strange confluence of events is accompanied by ''no'' smoking pistol, this too has some prior probability value. It does not ''exclude'' the possibility of foul play, but it does make it ''less likely''.


People do not often flip, overnight and without warning, from conscientious citizens to compulsive criminals. If they did, we would ''notice'' it.<ref>They might snap into a sudden orgy of extreme violence — but this plays out as desperate, meltdown ''mass'' murder, not calculated ongoing ''serial'' murder, and there is generally no doubt that it is murder and no shortage of [[direct evidence]] implicating the accused.</ref> When hitherto law-abiding people do slide into criminality, there is generally motivation, a history of antisocial behaviour, identifiable psychological trauma, drug dependency, observed personality change over time or diagnosed mental illness.<ref>Mental illnesses having a clear medical pathology, not suspiciously made-up ones from ''ex post facto'' symptoms like “Munchausen by proxy”. See the “circular correspondence bias” discussion below.</ref> Often ''all'' of these things. (Let us call them “[[Criminal propensity|criminal propensities]]”.)


The absence of ''any'' of these criminal propensities in a suspect’s makeup should ''reduce'' the “prior probability” of foul play by that suspect. As we will see, “circular correspondence bias” can take such a ''lack'' of criminal propensity and somehow invert it into confirmation.


Evidence supporting the intuition that “a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation” was not before the court. What evidence could there be of that? Somehow the jury was persuaded not just that she did murder her child, but that there was no plausible alternative explanation for the child’s disappearance. This was largely thanks to the strange collection of cognitive biases to which the prosecution had succumbed.
==The three phases of tunnel vision==
 
{{drop|S|o what is}} “prosecutor’s tunnel vision”, then, and how does it come about?<ref>[[JC]] draws upon ''{{plainlink|https://media.law.wisc.edu/m/2fjzd/findley_scott_ssrn_copy-1.pdf|The Multiple Dimensions of Tunnel Vision in Criminal Cases}}'' by Keith Findley and Michael Scott in the Wisconsin Law Review (2006).</ref>
It is a sort of “emotional conviction” to an (as-yet) unproven explanation. We become personally invested in a narrative; the consequences — and personal costs — of rejecting the conviction are great, and grow the more we commit to the position.
This kind of expert overreach is germane for the “[[healthcare serial murder]]” cases. Human biology is, in the technical sense, ''[[complex]]''. There is much about it that even experts do not yet, and may never, know. Patients may have conditions that are never tested for, and therefore never detected, before or after death. They may have conditions ''as yet unknown to medical science'', in which case they would not have been revealed by established tests in any case.


====High improbability, whatever the explanation====
When the allegation involves ''extremely improbable events'' — where there is ''no'' commonly experienced explanation — our natural human weakness at statistical reasoning comes into play. [[Base rate neglect]] (see below) becomes a risk.
 
Both possible explanations for Azaria Chamberlain’s disappearance (being snatched by a dingo and maternal infanticide) were ''extremely improbable''. What little data there were on dingo abductions suggested they were rare, but the data were not good. Not many people camp with infants in the Outback. Dingoes rarely get the chance to take them. There is a ''lot'' of evidence that maternal infanticide is ''extremely'' improbable: ''you'', dear reader, are evidence of that.
 
Now, had dingo attacks been commonly reported before Azaria’s disappearance, the possibility of maternal infanticide might never have come up.
 
The “[[healthcare serial murder]]” cases typically have this same feature: both the crime (serial murder) ''and'' the “innocent alternative” (an unusual cluster of natural deaths coinciding with the attendance of the same nurse) are intrinsically improbable. Base rate neglect — healthcare serial murders remain vanishingly unlikely — is a real risk.


===Getting there===
{{Drop|O|nce you are}} equipped with a hammer and have started wandering around the house looking for nails, there should still be scope to falsify a bad operating theory. But again, psychological biases can override the dispassionate application of cool logic.


====Hopeful extrapolation====
We are natural riddle-solvers. We build models of the world by habit. It is easy to slip beyond our range of reliable experiences and form theories for which we have little expertise, especially where abundant circumstantial evidence is [[consistent with]] — able to be fitted to — our side of the argument, rather than dispositive of it.


Litigation’s adversarial nature, in which advocates are meant to present their arguments in the best possible light, emphasising helpful facts and neglecting unhelpful ones, hardly helps. The underlying philosophy here is akin to Adam Smith’s “invisible hand”: from the interaction of opposed, self-interested advocates we expect the invisible hand of justice miraculously to emerge.


====Base rate neglect====
''Lawyers are not natural statisticians''. We should be careful with statistics, especially when they concern extremely improbable events.


“Base rate neglect” — the “prosecutor’s fallacy” — is the natural tendency to ignore the “base rate” of an outcome in a population and instead focus on “individuating” information about the specific case.


Linda is a single, middle-aged female philosophy graduate who is active in CND. If asked which description of Linda is more likely to be true, most people will choose “she is a bank teller who is active in the feminist movement” over “she is a bank teller” when plainly the former is a subset of the latter.<ref>{{Plainlink|https://www.psychologytoday.com/gb/blog/the-superhuman-mind/201611/linda-the-bank-teller-case-revisited|per Kahneman and Tversky}}. This work is not without its critics. JC wonders whether the same thing propels the commercial solicitor’s compulsion to over-description.</ref>
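The conjunction rule behind the Linda problem can be sketched in a few lines. The probabilities below are purely illustrative assumptions, not figures from Kahneman and Tversky’s study:

```python
# Conjunction rule: a conjunction can never be more probable than either
# of its conjuncts. The numbers here are illustrative assumptions only.
p_bank_teller = 0.02               # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.3      # assumed P(active feminist | bank teller)

# P(teller AND feminist) = P(teller) x P(feminist | teller)
p_teller_and_feminist = p_bank_teller * p_feminist_given_teller

# However vivid the richer description, it cannot be the likelier one.
assert p_teller_and_feminist <= p_bank_teller
print(p_teller_and_feminist <= p_bank_teller)  # True
```

Whatever values one plugs in, the combined description can never beat the bare one: intuitive plausibility is doing work that probability cannot.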


Say a medical test is expected to give a correct result 999 times out of 1,000. We are tempted to assume, therefore, that any positive result will be pretty much conclusive. But if the general prevalence of the condition in the population — the “base rate” — is just 1 in 100,000 then for every ''true'' positive result, we should expect 100 ''false'' ones. The probability that a positive test is accurate is only 1%.<ref>This sounds properly nuts but it is true. Assume you test 100,000 people. At a 99.9% accuracy rate, you will expect 99,900 correct results, and 100 false ones. But in that sample of 100,000 you would only expect 1 actual case of the condition. So if you are tested and receive a positive result, you have a 1 in 100 chance that it is a true result.</ref>
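The footnote’s arithmetic can be checked directly, a sketch using only the figures stated above:

```python
# Base rate neglect, worked through: a test that is right 999 times in
# 1,000, applied to a condition afflicting 1 person in 100,000.
population = 100_000
true_cases = population / 100_000        # base rate: ~1 actual case
accuracy = 0.999                         # correct 999 times out of 1,000

true_positives = true_cases * accuracy                        # ~1
false_positives = (population - true_cases) * (1 - accuracy)  # ~100

# Chance that a given positive result reflects a real case:
p_real = true_positives / (true_positives + false_positives)
print(f"{p_real:.1%}")  # 1.0%
```

A hundred spurious positives swamp the single genuine one; the “99.9% accurate” headline tells you almost nothing until you know the base rate.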


Take the [[healthcare serial murder]]er: say there is a 1 in 342 ''million'' chance that a nurse would be on duty for all suspicious deaths in a cluster at random. If one nurse ''was'' on duty for all those suspicious deaths, we might suppose it to be damning. The jury in the trial of Dutch nurse Lucia de Berk did: they convicted her of serial murder in 2003.


But this is to ignore the base rate. With no other prior information suggesting her guilt (in de Berk’s case, there was none), what is the probability that a given individual is a [[healthcare serial murder]]er?
 
On a planet of eight billion people, for guilt to be merely ''as likely'' as random chance, there would need to be ''twenty-three'' [[healthcare serial killers]] operating at any given moment. There do not appear to be that many in the world, let alone in healthcare environments.
 
Had that probability been correct (it turned out to be a wild underestimate) sheer chance was ''still'' a likelier explanation than that Lucia de Berk was a serial murderer. (The probability of her shift patterns coinciding by chance was subsequently reassessed as being more like 1 in 25. De Berk’s conviction was overturned in 2010.)
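The back-of-the-envelope reasoning runs as follows (the eight-billion world population is an assumption used purely for scale):

```python
# If P(shift pattern arises by chance) is 1 in 342 million, how many
# healthcare serial killers would there have to be, worldwide, before
# "she is one of them" became as likely as "it was chance"?
world_population = 8_000_000_000        # assumed, for scale
p_chance = 1 / 342_000_000              # the figure put to de Berk's jury

# Number of killers needed for a random person to be one with
# probability p_chance:
killers_needed = world_population * p_chance
print(round(killers_needed))  # 23

# The reassessed coincidence probability makes chance overwhelmingly likelier:
p_reassessed = 1 / 25
assert p_reassessed > p_chance
```

Even taking the jury’s figure at face value, the prior probability of the guilty explanation was no better than that of the innocent one; on the reassessed figure it was not remotely close.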


====Confirmation bias====
All observation is theory-dependent: scientists must first have a theory before they can test it: otherwise, how do they know what they are looking for?
 
Having committed to a bad theory, we tend to seek out, interpret, and best remember information that confirms it. We may over-weight supporting evidence and disregard contradicting evidence. This is what the adversarial system expects.  


This is confirmation bias. It has two phases. The first applies in “theory formation”: the man with a hammer is primed to see nails.


Suspicious that strangely-behaving [[Lindy Chamberlain]] might have something to hide, Police searched her possessions, tent and vehicle, eventually finding what they believed to be “[[fetal blood]]” splashed under the dashboard of her husband’s car.<ref>Much later the substance was retested. It turned out to be sound-deadening material containing iron oxide.</ref> This set the police in a direction. Before long they had constructed an elaborate theory of how [[Lindy Chamberlain]] had committed murder.


The second aspect is in theory ''corroboration''. Here evidence which, when taken in the abstract, would tend to “be [[consistent with]]” innocence can, with confirmation bias, be presented as damning.


We would ''expect'' an innocent mother to behave “oddly” in the aftermath of her infant daughter’s disappearance. We would ''expect'' an innocent nurse to lead a healthy social life. We would ''expect'' an innocent defendant not to admit his crimes or express remorse for his actions.


But these are exactly the behaviours one might expect of the innocent: things that point ''away'' from the defendant’s guilt appear, with confirmation bias, to point ''towards'' it.


===Staying there===
{{drop|O|nce we are}} in the tunnel, there are cognitive biases that prevent us from finding our way out.
==== Hindsight bias and the reiteration effect ====
In hindsight, people tend to think an eventual outcome was inevitable or more likely than they might have before it happened. “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” versus “Given that this woman’s nine-week-old infant has disappeared from the campsite, and the police suspect her of foul play, what is the chance that she brutally murdered her own child?”


Through “hindsight bias” we project our knowledge of actual outcomes back onto the behaviour we observed in the past, without realising that our perception of the past has been tainted by our knowledge of the outcome.


Once a person becomes a prime suspect, hindsight bias suggests that, upon reflection, she was the likely suspect from the beginning. Evidence is malleable in light of this “realisation”.


There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more invested in it they are, the more entrenched their conclusion becomes, and the more obvious it appears that all evidence pointed that way from the very beginning. It is increasingly difficult for police and prosecutors to consider alternative perpetrators or theories of a crime.
There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more they become invested in it. Their conclusions become entrenched, and it appears obvious that all evidence pointed to the defendant from the outset. Prosecutors and defenders find it increasingly hard to consider alternative theories of the situation.


====Randle McMurphy’s dilemma====
“Correspondence bias”, or the “fundamental attribution error”, automatically attributes observed behaviour to malice without considering other explanations. Our favourite, per [[Hanlon’s razor]], is stupidity:
{{quote|{{hanlon’s razor}}}}
But it applies just as well to innocent explanations and to alternative guilty ones.
=====The circular correspondence bias model=====
A strangely prevalent form of ''circular'' correspondence bias works as follows:
 
There is ''weak'' [[circumstantial evidence]] that X committed a crime.
 
The (highly unusual) traits of people who commit that sort of (highly unusual) crime are attributed to X (this is a “fundamental attribution error”).


X’s newly-attributed traits are cited as evidence corroborating the existing weak circumstantial evidence. This is circular.
 
X is now characterised as a “sociopath”, “narcissist”, “attention-seeker”, or even “Munchausen’s syndrome by proxy sufferer”, so is more likely to have committed the highly unusual crime.
 
Other aspects of X’s behaviour which, but for the allegation, would be normal, now appear to verify the trait. (He socialises normally. He sends condolences. He works extra shifts.) This, of course, is also circular.
 
X is convicted on the compelling evidence of the opportunity and his highly unusual and suspicious behaviour.
 
You might recognise this as the plot driver of Ken Kesey’s ''One Flew Over the Cuckoo’s Nest''.
 
=====A worked example: Doctor Bob=====
So, say there is an unusual cluster of deaths in the geriatric ward of a given hospital. All deaths — eight of them, over six months — coincide with the shift pattern of one doctor, Bob.
 
This raises a weak but, Bob thinks, easily rebuttable presumption of foul play on Bob’s part. (“Since I didn’t do it, there will be no direct evidence and no strong circumstantial evidence specifically implicating me.”)
 
Before this statistical correlation, Bob was not under suspicion. Nor was his behaviour unusual: he was a normal young, diligent doctor with an active social life. Nor does his behaviour change ''after'' the cluster of deaths.


But the foul play — if that’s what it is — is ''horrific'': someone is systematically murdering little old ladies. Only a psychopath with a stone-cold heart and a narcissistic personality disorder could do that. If it is Bob, he must be a psychopath with a narcissistic personality disorder. This logic is already circular, but the circle is big and conditional enough not to be obvious.


Now: what could be more indicative of a stone-cold, narcissistic psychopath who has just murdered someone than a fastidiously tidy person — hence, no evidence, right? — who sent a card to the victim’s wife before going out drinking and dancing with his friends? The net is closing in.
=====A second worked example: Nurse Lucy=====
How an odd series of coincidences might wind up with a diligent neonatal nurse being handed seven whole-of-life sentences:
There has been an unusually large number of unexplained collapses. (The thing about random events is that they are clumpy, not perfectly smooth and predictable, but let’s park that.)

One explanation for this cluster is that someone is harming babies. (Another is that one, or a combination, of any number of factors — including a literally infinite set of things we don’t even know about — could have prompted this cluster; or indeed that, per step 1, the cluster is simply a truly random variation that was not prompted by anything in particular. But let’s park those too.)

(''Hospital administrators''): Hey, there’s a nurse who was on duty for a ''lot'' of these collapses. Most of them, in fact.

So it is not beyond possibility that someone ''is'' deliberately harming babies! Whoah. That’s serious. Let us be on the safe side. Let’s call the police!

Please, Mr Policeman, sir, we have had a spike in unexplained deaths in our ICU and we can’t rule out foul play.

(''Mr Police''): But this is serious! What makes you think there is foul play?

Well, without wishing to tell tales, there ''is'' this nurse who keeps turning up like a bad penny. So there is that. But we don’t know: it may be a coincidence.

(''Mr Police''): Well, let’s have a look at the data! Only a limited number of people have any plausible opportunity to harm babies deliberately — nurses and doctors in the ICU, pretty much — so we should be able to rule them all out pretty quickly, right? Let’s have a look at the shift rota to see who was on duty.

(''Mr Police''): Well, lookee here: there is one, just one, nurse who was on duty for pretty much<ref>Rather embarrassingly, it turns out that the shift rota was erroneous and this nurse was ''not'' on duty on every single case, but hey ho.</ref> every single one of those collapses. Ladies and gentlemen, I think we have our answer.

(''Hospital administrators, aside''): I always knew she was a wrong ’un.

==Antidotes==
{{quote|
Q: How many psychiatrists does it take to change a light bulb? <br>
A: Just one; but the light bulb really has to want to change.}}
There are some strategies to counteract the effect, but the predominant one is to ''want'' to keep an open mind.
====Hanlon’s and Otto’s razors====
{{quote|“Do not attribute to ''malice'' things that can just as well be explained by ''stupidity''.”
:—''[[Hanlon’s razor]]''}}
Don’t assume malice where stupidity will do; likewise, per Otto’s razor, don’t attribute to ''virtue'' something that could equally be attributed to self-interest; or to ''skill'' something that could equally be attributed to dumb ''luck''.

Latest revision as of 21:52, 16 September 2024


The same tunnel vision also motivates ideologies, conspiracies and management philosophy: 360-degree performance appraisals, outsourcing, the war on drugs and the worldwide AML military-industrial complex are all cases where those “prosecuting” the theory stick with it even though the weight of evidence suggests it does not work and may even be counterproductive.

The “prosecutor’s tunnel” begins with clear but simplistic — misleading — models of a messy world. Humans have a weakness for these: we are pattern-matching, puzzle-solving animals. We are drawn to neatness. We resile from intractability as it indicates weakness: that our frail human intellect has been defeated by the ineffable natural order of things.

An elegant hypothesis

Sometimes the sheer elegance of a prosecutor’s case can crowd out common sense and the basic intuition that this cannot be right.

We have built our legal institutions to be vulnerable to this kind of crowding out. Criminal law proceeds upon data and the weight of evidence but disallows “intuition”. And there is an asymmetry: evidence is better at saying what did happen than what did not. This is especially so where there is no direct evidence that the defendant actually did what she is accused of.

Circumstantial evidence does not directly implicate a defendant but is consistent with the prosecution theory. It accumulates: if there is enough of it, and none points away from the defendant, it can tell us something. But, correlation and causation: evidence that is “consistent with” a prosecution theory does not prove it: that JC owns a bicycle is consistent with his competing in the Tour de France; it does not make him any more likely to do it. Evidence can look more meaningful than it is. This is where intuition ought to be able to help us.

As it is, intuition’s role is relegated to underpinning the presumption of innocence. A prosecutor must prove guilt; the accused need not prove anything: she cannot be expected to explain what happened, for the simple reason that an innocent person should have no better idea about it than anyone else. The jury, we hope, leans on its intuition when conjuring doubts.

Experience tells us otherwise. In what follows, JC takes three notorious cases from the antipodes to see what can happen when, with no direct evidence, those arguing the case become afflicted with tunnel vision, and intuition and common sense are relegated behind “data” and circumstantial evidence. Then we will look at what causes this condition.

Narrative biases

These cases illustrate the problem of relying on circumstantial evidence: with no independent direct evidence, one tends to start with a hypothesis and fit whatever secondary and forensic evidence one has into it, discarding whatever does not fit. This is the classic tunnel vision scenario. It can afflict those who would defend suspects just as firmly as those who prosecute them.

All kinds of theories circulated owing to the Chamberlains’ unusual religious beliefs and “odd behaviour” in the aftermath of Azaria’s disappearance. But devout Christianity is hardly a solid prior indicating a tendency to murder. Nor is “odd behaviour” in the aftermath of a mother’s most extreme psychological trauma. Who would not behave oddly in those circumstances?

That anyone could bring themselves to cold-bloodedly murder a nine-week-old baby is hard to imagine. Statistically, it is highly improbable. That the child’s own mother would is, in the absence of compelling evidence, preposterous. To even start with this theory you must surely have compelling grounds to believe it over all other possibilities — if not credible eye-witness evidence, then a documented history of violence, behavioural volatility or psychiatric illness grave enough to overthrow the strong human instinct to protect vulnerable infants. Lindy Chamberlain had no such history.

If there is any plausible alternative explanation for the baby’s disappearance, there must have been a reasonable doubt. It need not be more probable than the prosecution case: just not out of the question. Lindy Chamberlain provided one: a dingo snatching the child might have been unprecedented, but it was possible. There were dingoes in the area. They are predators. They are strong enough to carry away a human infant. A dingo was no less likely than a new mother noiselessly murdering her own infant just yards from a group of independent witnesses. That ought to have been the end of it.

Likewise, what Peter Ellis was alleged to have done is extraordinarily improbable. There are few documented cases of ritualistic abuse on that scale anywhere in the world. There are none in New Zealand. For such a thing to have happened without any prior evidence of such behaviour, with no adult witnesses, no one noticing the absent children and for none of the children to bear any trace of their supposed injuries makes it even less likely.

And there was a plausible alternative: nothing happened at all. All that was required for that to be true was for preschool children, perhaps at the prompt of interviewers already in the grip of prosecutor’s tunnel vision, to make things up. By comparison with “untraceable, unwitnessed, wide-scale ritual satanic abuse”, “children exercising their imaginations to please adults” is not improbable.

It is different for David Bain. While it is true that familicide is extremely rare and, therefore, absent prior evidence, highly improbable, there is no question that the Bain family were murdered. The only question was by whom.

On David’s own theory, only two people could have done it: his father and himself. It was, therefore, definitely familicide: the abstract improbability of that explanation is beside the point. The probability that David was responsible is accordingly much higher: before considering any further evidence there is a 50% chance he was responsible.

And a lot of the further evidence pointed in his direction. To not be the murderer, on his own evidence, David would have been extremely unlucky — forgetting to turn on the light, inadvertently disposing of exculpatory evidence, having incriminating injuries he could not explain — while no such evidence pointed to Robin. David’s defenders had their own tunnel vision, focusing narrowly on the provenance of each piece of incriminating evidence, identifying formal shortcomings in its value as evidence: questioning the manner of its collection, the chain of custody, raising possibilities of innocent explanations without evidence to support that alternative, and disregarding the wider context of the whole case.

Now, David Bain was acquitted of all charges. On the evidence, the jury could not rule out the possibility that Robin Bain was responsible. Not being satisfied beyond reasonable doubt that David was the perpetrator, he was correctly acquitted at law. But it remains likely that David was the perpetrator.[1] As a piece of judicial procedure, the comparison between Bain’s case and those of Ellis and Chamberlain is stark.

Tunnel vision and circumstantial evidence

Where there is reliable direct evidence — eyewitnesses, recordings, and causative links between a suspect and the allegation — there is little need for inference; the evidence speaks for itself. But cases comprised predominantly of circumstantial evidence — that therefore depend on inferential reasoning — are vulnerable to tunnel vision because the complex of cognitive biases that make up prosecutor’s tunnel vision affect the process of inference.

Upstanding citizen turns master criminal. Does well.

Prosecutor’s tunnel vision cases often involve hitherto law-abiding citizens suddenly committing fiendish crimes without warning, explanation or motive.

Now JC is, ahem, told that committing violent crime without leaving any incriminating evidence is extremely hard. Especially in a controlled environment like an infants’ daycare centre or a hospital.

To be sure, serial criminals can operate in these environments but they will need to be good: meticulous in their preparation and method. Over time, they will hone their techniques and perfect a modus operandi, acquiring a ghoulish sort of expertise in murder: killing patients in a closely monitored, controlled environment populated by trained experts hardly lends itself to opportunistic, freestyle offending. Hospitals, in particular, overflow with specialists who can detect subtle clues that ordinary laypeople — and burgeoning criminals learning their craft — have no idea about.

As with any complicated discipline, one learns as one goes. We should not, therefore, expect “beginners” to perform like master jewel thieves, slipping in and out, striking in the dark and leaving no trace. They will blunder. They will be careless. They will leave evidence. They will slip up, leave giveaways and clumsily trigger red flags. From new criminals, we should expect “smoking guns”.

So if a strange confluence of events is accompanied by no smoking pistol, this too has some prior probability value. It does not exclude the possibility of foul play, but it does make it less likely.

People do not often flip, overnight and without warning, from conscientious citizens to compulsive criminals. If they did, we would notice it.[2] When hitherto law-abiding people do slide into criminality, there is generally motivation, a history of antisocial behaviour, identifiable psychological trauma, drug dependency, observed personality change over time or diagnosed mental illness.[3] Often all of these things. (Let us call them “criminal propensities”.)

The absence of any of these criminal propensities in a suspect’s makeup should reduce the “prior probability” of foul play by that suspect. As we will see, “circular correspondence bias” can take such a lack of criminal propensity and somehow invert it into confirmation.

Where a crime has certainly been committed, this goes only to who the perpetrator is. There may (as in David Bain’s case) be only a small universe of credible suspects. If all “possible suspects” have the same lack of criminal propensity, it will count for little. But if the universe of “potential suspects” is large — or if it is plausible that no crime was committed at all — an individual’s lack of any criminal propensity should tell us something “circumstantial”.

Neither Lindy Chamberlain nor Peter Ellis had any criminal propensity, and in both cases there was a plausible alternative explanation. For David Bain it was different.

Burden and standard of proof

The burden of proof is a different thing to the standard of proof. The burden is who has to prove their case: this falls squarely on the prosecution. The defence is not required to prove anything, least of all the accused’s innocence.

But there is tension between that crystalline legal theory and the practical reality: it is in the defendant’s interest that someone casts doubt into jurors’ minds. Since the Crown plainly won’t be doing that, the defence must either rely on jurors to confect plausible doubts by themselves, or it must plant some doubts there. It is a brave defence counsel indeed who puts her client’s future in the hands of a jury’s imagination and capacity for creative thought.

All the same, the prosecution’s standard of proof — what it must do to discharge its burden of proof — is, in theory, extremely high. Courts have dumbed down the time-honoured phrase beyond reasonable doubt: these days, juries are directed to convict only if they are “sure”. This is meant to mean the same thing, but not everyone is persuaded that is how juries understand it.[4]

There is some reason to think that juries start with an ad hoc presumption that any defendant put before them is somewhat likely to be guilty: if the police were competent and acted in good faith, why else would the defendant be in the dock?

So where there is only tendentious data supporting a defendant’s guilt but a total lack of “data” supporting her innocence — what evidence could there be that you did not do something that did not happen? — there are grounds for confusion, and there is good evidence that juries do indeed get confused.

Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the circumstantial evidence of what looked like blood sprays in the footwell of the family car.[5]

Evidence supporting the intuition that “a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation” was not before the court. What evidence could there be of that? Somehow the jury was persuaded not just that she did murder her child, but that there was no plausible alternative explanation for the child’s disappearance. This was largely thanks to the strange collection of cognitive biases to which the prosecution had succumbed.

So what is “prosecutor’s tunnel vision”, then, and how does it come about?[6] It is a sort of emotional “conviction”: a commitment to an (as-yet) unproven explanation. We become personally invested in a narrative; the consequences — and personal costs — of rejecting the conviction are great, and grow the more we commit to the position.

Tunnel vision has three phases: first, there must be enabling background conditions that make us vulnerable to tunnel vision; second, there are pathways into a given tunnel; third, there are cognitive biases that keep us there.

The lessons are two-fold:

We are not as rational as we like to think and
Data is never the whole story.

Background conditions

Certain dispositions, biases and miscellaneous psychological tics come together to create the conditions for tunnel vision to swamp an earnestly-held narrative:

Mainly circumstantial evidence

In that tunnel vision is a collection of cognitive biases infecting how we draw inferences, it usually arises where there is no direct evidence of wrongdoing. Where there are reliable eyewitnesses there is little need to infer what happened: someone saw it. Where there are no witnesses and the case depends on circumstantial evidence — even more so where it isn’t clear there was a crime at all — the jury must infer what happened from purely circumstantial evidence.

There was no clear evidence Azaria Chamberlain was dead, let alone murdered: she was simply missing. Had a reliable witness seen a dingo carrying her away — even Lindy Chamberlain did not claim to have seen that — there would be little scope for inference about the significance of the red-brown material spattered under the dashboard of the Chamberlain’s car. (The coroner’s damning report to the crown prosecutor in the Chamberlain case is a chilling example of tunnel vision.)

Information glut

The fact is, there are very few political, social, and especially personal problems that arise because of insufficient information. Nonetheless, as incomprehensible problems mount, as the concept of progress fades, as meaning itself becomes suspect, the Technopolist stands firm in believing that what the world needs is yet more information.[7]

The more circumstantial evidence there is, the more scope for inference, and the more fantastical narratives one can draw. If the alleged crime occurs in a tightly-controlled environment designed to generate technical data and specialist information, and where deep subject matter experts are on hand to observe and analyse that information in retrospect, conditions are ripe for tunnel vision.

This is exactly the scenario in which the healthcare serial murder cases arise. The alleged crimes, though rarely witnessed, take place within a carefully controlled environment[8]. Access is monitored by CCTV and controlled by swipecards and elaborate security systems. Detailed medical protocols govern and track the storage and dispensation of medicines. Sophisticated equipment — machines that go “ping” — monitor patients’ vital signs around the clock. Nurses, orderlies, consultants and doctors constantly mill around at all hours, doing ward rounds, checking in on patients and generally keeping an eye out for signs of trouble.

Though these systems seem incapable of capturing direct evidence of wrongdoing, they still generate a colossal amount of very “scientific” medical and digital data. This is capable of being analysed and framed to support — to be “consistent with” — any number of different and frequently contradictory inferences and theories of the case.

Expert overreach

To a man with a hammer, everything looks like a nail.

—Abraham Maslow

It is hardly news that existing knowledge and beliefs shape what we see and how we see it. Prosecutors (and defenders) hold pet theories about human behaviour just like anyone else.

You don’t sign up for the army if you hope not to shoot a gun. Nor do you join the police hoping to never find crime, nor take a crown warrant if you don’t expect to prosecute. Those involved in prosecution are primed this way, as are we all: to be useful; for their roles to be important and to make a difference.

This is no less so for expert witnesses. They are incentivised to support the cases on which they are engaged.[9] Yet subject matter experts can overestimate their ability to analyse and judge subtle problems, especially those in fields adjacent to, but not directly within, their expertise. They may over-weight the overall significance of matters that do fall within their expertise against those that do not. They are less likely to consider alternative models, explanations, theories or evidence that de-emphasise their expertise, let alone theories of the case that contradict it.

This kind of expert overreach is germane for the “healthcare serial murder” cases. Human biology is, in the technical sense, complex. There is much about it that even experts do not yet, and may never, know. Patients may have conditions that are never tested for, and therefore never detected, before or after death. They may have conditions as yet unknown to medical science, in which case they would not have been revealed by established tests in any case.

High improbability, whatever the explanation

When the allegation involves extremely improbable events — where there is no commonly experienced explanation — our natural human weakness at statistical reasoning comes into play. Base rate neglect (see below) becomes a risk.

Both possible explanations for Azaria Chamberlain’s disappearance (being snatched by a dingo and maternal infanticide) were extremely improbable. What little data there was on dingo abductions suggested they were rare, but the data were not good. Not many people camp with infants in the Outback. Dingoes rarely get the chance to take them. There is a lot of evidence that maternal infanticide is extremely improbable: you, dear reader, are evidence of that.

Now, had dingo attacks been commonly reported before Azaria’s disappearance the possibility of maternal infanticide may never have come up.

The “healthcare serial murderer” cases typically have this same feature: both the crime (serial murder) and the “innocent alternative” (an unusual cluster of natural deaths coinciding with the attendance of the same nurse) are intrinsically improbable. Base rate neglect — healthcare serial murders remain vanishingly unlikely — is a real risk.

Getting there

Once you are equipped with a hammer and have started wandering around the house looking for nails, there should still be scope to falsify a bad operating theory. But again, psychological biases can override the dispassionate application of cool logic.

Hopeful extrapolation

We are natural riddle-solvers. We build models of the world by habit. It is easy to slip beyond our range of reliable experiences and form theories for which we have little expertise, especially where abundant circumstantial evidence is consistent with — able to be fitted to — our side of the argument, rather than dispositive of it.

Litigation’s adversarial nature, in which advocates are meant to present their arguments in the best possible light, emphasising helpful facts and neglecting unhelpful ones, hardly helps. The underlying philosophy here is akin to Adam Smith’s “invisible hand”: from the interaction of opposed, self-interested advocates we expect the invisible hand of justice miraculously to emerge.

Base rate neglect

Lawyers are not natural statisticians. We should be careful with statistics especially when they concern extremely improbable events.

“Base rate neglect” — the “prosecutor’s fallacy” — is the natural tendency to ignore the “base rate” of an outcome in a population and instead focus on “individuating” information about the specific case.

Linda is a single, middle-aged female philosophy graduate who is active in CND. If asked which description of Linda is more likely to be true, most people will choose “she is a bank teller who is active in the feminist movement” over “she is a bank teller” when plainly the former is a subset of the latter.[10]
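The Linda example turns on a simple law of probability: a conjunction can never be more probable than either of its conjuncts. A minimal sketch, with purely illustrative numbers (the figures are assumptions, not data about the experiment):

```python
# Illustrative (made-up) probabilities for the "Linda" problem.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.30   # P(active feminist | bank teller)

# "Teller AND feminist" is a subset of "teller", so its probability
# cannot exceed p_teller, however well the richer description
# seems to fit Linda.
p_teller_and_feminist = p_teller * p_feminist_given_teller

print(p_teller_and_feminist <= p_teller)  # prints True, for any choice of numbers
```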

Say a medical test is expected to give a correct result 999 times out of 1,000. We are tempted to assume, therefore, that any positive result will be pretty much conclusive. But if the general prevalence of the condition in the population — the “base rate” — is just 1 in 100,000, then for every true positive result, we should expect 100 false ones. The probability that a positive test is accurate is only about 1%.[11]
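The arithmetic here can be checked with Bayes’ rule. A minimal sketch (the function name, and the simplifying assumption that the test errs equally in both directions, are ours, not the author’s):

```python
def posterior(base_rate: float, accuracy: float) -> float:
    """P(has condition | positive test), by Bayes' rule, assuming the
    test's false-positive and false-negative rates are both (1 - accuracy)."""
    true_positives = base_rate * accuracy
    false_positives = (1 - base_rate) * (1 - accuracy)
    return true_positives / (true_positives + false_positives)

# A test that is right 999 times in 1,000, for a condition with a
# base rate of 1 in 100,000:
print(f"{posterior(1 / 100_000, 999 / 1000):.1%}")  # prints 1.0%
```

Per 100,000 people there is roughly one true positive (1 × 0.999) against about 100 false ones (99,999 × 0.001), hence the ~1% figure.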

Take the healthcare serial murderer: say there is a 1 in 342 million chance that a nurse would be on duty for all suspicious deaths in a cluster at random. If one nurse was on duty for all those suspicious deaths, we might suppose it to be damning. The court that tried Dutch nurse Lucia de Berk did: it convicted her of serial murder in 2003.

But this is to ignore the base rate. With no other prior information suggesting her guilt (in de Berk’s case, there was none) what is the probability that a given individual is a healthcare serial murderer?

For those odds to make guilt merely as likely as random chance — not more likely — there would need to be twenty-three healthcare serial killers operating at any given moment. There do not appear to be that many in the world, let alone in healthcare environments.

Had that probability been correct (it turned out to be a wild underestimate), sheer chance was still a likelier explanation than that Lucia de Berk was a serial murderer. (The probability of her shift patterns coinciding by chance was subsequently reassessed as more like 1 in 25.) De Berk’s conviction was overturned in 2010.
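One way to reconstruct the “twenty-three” figure above (this back-of-envelope reconstruction, and the round population figure, are our assumptions, not the author’s): for the prior odds that a given nurse is a serial murderer to match a 1-in-342-million coincidence, that would have to be the rate of healthcare serial killers in the population at large.

```python
world_population = 7_900_000_000  # assumed round figure
coincidence_odds = 342_000_000    # the prosecution's 1-in-342-million figure

# If 1 person in 342 million were a healthcare serial killer, the world
# would contain this many of them:
implied_killers = world_population / coincidence_odds
print(round(implied_killers))  # prints 23
```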

Confirmation bias

All observation is theory-dependent: scientists must first have a theory before they can test it; otherwise, how would they know what they are looking for?

Having committed to a bad theory, we tend to seek out, interpret, and best remember information that confirms it. We may over-weight supporting evidence and disregard contradicting evidence. This is what the adversarial system expects.

This is confirmation bias. It has two phases. The first applies in “theory formation”: the man with a hammer is primed to see nails.

Suspicious that strangely-behaving Lindy Chamberlain might have something to hide, police searched her possessions, tent and vehicle, eventually finding what they believed to be “fetal blood” splashed under the dashboard of her husband’s car.[12] This set the police off in one direction. Before long they had constructed an elaborate theory of how Lindy Chamberlain had committed murder.

The second aspect is in theory corroboration. Here evidence which, when taken in the abstract, would tend to “be consistent with” innocence can, with confirmation bias, be presented as damning.

An innocent mother might well behave “oddly” in the aftermath of her infant daughter’s disappearance. An innocent nurse might well lead a healthy social life. An innocent defendant would, of course, neither admit to his crimes nor express remorse for actions he did not commit.

Yet with confirmation bias, these very behaviours, which point away from the defendant’s guilt, appear to point towards it.

Staying there

Once we are in the tunnel, there are cognitive biases that prevent us from finding our way out.

Hindsight bias and the reiteration effect

In hindsight, people tend to think an eventual outcome was inevitable, or at least more likely than they would have judged it beforehand. Compare: “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” with “Given that this woman’s nine-week-old infant has disappeared from the campsite, and the police suspect her of foul play, what is the chance that she brutally murdered her own child?”

Through “hindsight bias” we project our knowledge of the outcome back onto past behaviour, without realising that our perception of the past has been tainted by that knowledge.

Once a person becomes a prime suspect, hindsight bias suggests that, upon reflection, she was the likely suspect from the beginning. Evidence is malleable in light of this “realisation”.

There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more they become invested in it. Their conclusions become entrenched, and it appears obvious that all evidence pointed to the defendant from the outset. Prosecutors and defenders find it increasingly hard to consider alternative theories of the situation.

Randle McMurphy’s dilemma

“Correspondence bias”, or the “fundamental attribution error”, leads us automatically to attribute observed behaviour to malice without considering other explanations. Our favourite, per Hanlon’s razor, is stupidity:

“Do not attribute to malice things that can just as well be explained by stupidity.”

But it applies just as well to innocent explanations, and alternative guilty ones. It works like this:

The circular correspondence bias model

A strangely prevalent form of circular correspondence bias works as follows:

1. There is weak circumstantial evidence that X committed a crime.

2. The (highly unusual) traits of people who commit that sort of (highly unusual) crime are attributed to X. (This is a “fundamental attribution error”.)

3. X’s newly-attributed traits are cited as evidence corroborating the existing weak circumstantial evidence. (This is circular.)

4. X is now characterised as a “sociopath”, “narcissist”, “attention-seeker” or even “Munchausen’s syndrome by proxy sufferer”, and so is more likely to have committed the highly unusual crime.

5. Other aspects of X’s behaviour which, but for the allegation, would be unremarkable now appear to verify the trait. (He socialises normally. He sends condolences. He works extra shifts.) This, of course, is also circular.

6. X is convicted on the compelling evidence of opportunity and his highly unusual and suspicious behaviour.

You might recognise this as the plot driver of Ken Kesey’s One Flew Over the Cuckoo’s Nest.

A worked example: Doctor Bob

So, say there is an unusual cluster of deaths in the geriatric ward of a given hospital. All the deaths — eight of them, over six months — coincide with the shift pattern of one doctor, Bob.

This raises a weak but, Bob thinks, easily rebuttable presumption of foul play on Bob’s part. (“Since I didn’t do it, there will be no direct evidence and no strong circumstantial evidence specifically implicating me.”)

Before this statistical correlation emerged, Bob was not under suspicion. Nor was his behaviour unusual: he was a normal young, diligent doctor with an active social life. Nor does his behaviour change after the cluster of deaths.

But the foul play — if that’s what it is — is horrific: someone is systematically murdering little old ladies. Only a psychopath with a stone-cold heart and a narcissistic personality disorder could do that. If it is Bob, he must be a psychopath with a narcissistic personality disorder. This logic is already circular, but the circle is big and conditional enough not to be obvious.

We now start inspecting Bob’s behaviour. We find him to be meticulously tidy. He enjoys socialising, attending a regular salsa class on Wednesdays. He sent condolence cards to the families of several deceased patients.

Now: what could be more indicative of a stone-cold, narcissistic psychopath who has just murdered someone than a fastidiously tidy person — hence, no evidence, right? — who sent a card to the victim’s wife before going out drinking and dancing with his friends? The net is closing in.

A second worked example: Nurse Lucy

How an odd series of coincidences might wind up with a diligent neonatal nurse being handed seven whole-of-life sentences:

There has been an unusually large number of unexplained collapses. (The thing about random events is that they are clumpy, not perfectly smooth and predictable, but let’s park that.)

One explanation for this cluster is that someone is harming babies. (Another is that it could be one of, or a combination of, any number of factors, including a literally infinite set of things we don’t even know about, or indeed that, per step 1, the cluster is simply a truly random variation not prompted by anything in particular. But let’s park those too.)

(Hospital administrators): Hey, there’s a nurse who was on duty for a ''lot'' of these collapses. Most of them, in fact.

So it is not beyond possibility that someone ''is'' deliberately harming babies! Whoa. That’s serious. Let us be on the safe side. Let’s call the police!

(Hospital administrators): Please, Mr Policeman, sir, we have had a spike in unexplained deaths in our ICU and we can’t rule out foul play.

(Mr Police): But this is serious! What makes you think there is foul play?

(Hospital administrators): Well, without wishing to tell tales, there is this nurse who keeps turning up like a bad penny. So there is that. But we don’t know: it may be a coincidence.

(Mr Police): Well, let’s have a look at the data! Only a limited number of people have a plausible opportunity to harm babies deliberately — nurses and doctors in the ICU, pretty much — so we should be able to rule them all out pretty quickly, right? Let’s have a look at the shift rota to see who was on duty.

(Mr Police): Well, lookee here: there is one nurse, just one, who was on duty for pretty much[13] every single one of those collapses. Ladies and gentlemen, I think we have our answer.

(Hospital administrators, aside): I always knew she was a wrong ’un.
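The trawl through the shift rota is a textbook multiple-comparisons problem: with enough nurses working random shifts, ''someone'' will look like a bad penny. A small simulation sketches the point (all figures, including the ward size, shift probability and number of collapses, are illustrative assumptions, not taken from any real case):

```python
import random

random.seed(42)

# Illustrative assumptions: a ward with 30 nurses, each working a random
# third of shifts, and 8 collapses scattered over those shifts at random.
n_nurses = 30
p_on_duty = 1 / 3
n_collapses = 8
n_trials = 10_000

hits = 0
for _ in range(n_trials):
    # For each collapse, count which nurses happened to be on duty.
    counts = [0] * n_nurses
    for _ in range(n_collapses):
        for nurse in range(n_nurses):
            if random.random() < p_on_duty:
                counts[nurse] += 1
    # Was *some* nurse on duty for at least 6 of the 8 collapses?
    if max(counts) >= 6:
        hits += 1

print(f"P(some nurse at >= 6 of 8 collapses, by chance) ≈ {hits / n_trials:.1%}")
```

Under these assumed figures the answer comes out somewhere around 45%: roughly a coin flip that an entirely innocent rota produces a nurse who “keeps turning up like a bad penny”. The per-nurse probability is small; the any-nurse probability is not, and it is the any-nurse probability that a rota trawl tests.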

  1. Christchurch Journalist Martin Van Beynen’s fantastic podcast Black Hands compellingly makes this case.
  2. They might snap into a sudden orgy of extreme violence — but this plays out as desperate, meltdown mass murder, not calculated, ongoing serial murder, and there is generally no doubt that it is murder and no shortage of direct evidence implicating the accused.
  3. Mental illnesses having a clear medical pathology, that is; not ones suspiciously confected out of ex post facto symptoms like “Munchausen’s by proxy”. See the “circular correspondence bias” discussion below.
  4. New Law Journal: The Trouble With “Sure”
  5. In fairness, the Crown submitted expert forensic analysis that it was specifically fetal blood, so you can hardly fault the jury here. You can fault the Crown’s forensics team, though: it turned out to be acoustic deadening spray and not blood of any kind!
  6. JC draws upon The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott in the Wisconsin Law Review (2006).
  7. Neil Postman, Technopoly: The Surrender of Culture to Technology, 1992.
  8. In the cases the JC has managed to track down, not one involves anyone seeing the accused commit any unequivocal act of harm.
  9. There is some debate about expert witness conflicts of interest in criminal law circles but, given how much public interest there is in the question, it has had surprisingly little public exposure as yet: perhaps the Post Office Horizon IT scandal and the Letby case will change that.
  10. per Kahneman and Tversky. This work is not without its critics. JC wonders whether the same thing propels the commercial solicitor’s compulsion to over-description.
  11. This sounds properly nuts, but it is true. Assume you test 100,000 people. At a 99.9% accuracy rate, you will expect 99,900 correct results and 100 false ones. But in that sample of 100,000 you would expect only 1 actual case of the condition. So if you are tested and receive a positive result, there is only about a 1 in 100 chance that it is a true result.
  12. Much later the substance was retested. It turned out to be sound-deadening material containing iron oxide.
  13. Rather embarrassingly, it turns out that the shift rota was erroneous and this nurse was not on duty on every single case, but hey ho.