Template:M intro crime tunnel vision

From The Jolly Contrarian
{{d|Prosecutor’s tunnel vision|/ˈprɒsɪkjuːtəz/ /ˈtʌnᵊl/ /ˈvɪʒᵊn/|n}}
{{drop|T|he collection of}} [[bias]]es and cognitive gin-traps that can lead ''prosecutors'' — those who “prosecute” a particular theory of the world — to stick with it, however starkly it may vary from available evidence and common sense.

So named because it is often ''literal'' prosecutors, of crimes, who suffer from it. This kind of tunnel vision has led to notorious miscarriages of justice in which innocent people are convicted beyond reasonable doubt notwithstanding clear, plausible and more credible alternative explanations for their ostensible “crimes”.
{{quote|
By tunnel vision, we mean that “compendium of common [[Heuristic|heuristics]] and logical fallacies,” to which we are all susceptible, that lead actors in the criminal justice system to “focus on a suspect, select and filter the evidence that will ‘build a case’ for conviction, while ignoring or suppressing evidence that points away from guilt.” This process leads investigators, prosecutors, judges, and defence lawyers alike to focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.
: — ''The Multiple Dimensions of Tunnel Vision in Criminal Cases'' by Keith Findley and Michael Scott (2006)}}
{{Quote|Do not attribute to malice what can satisfactorily be explained by stupidity.
:—''[[Hanlon’s razor]]''}}
{{quote|To a man with a hammer, everything looks like a nail.
:—Abraham Maslow}}
It is not just miscarriages of justice: the same tunnel vision motivates ideologies, [[Conspiracy theory|conspiracies]] and management philosophy. 360-degree [[performance appraisal]]s, the war on drugs, the worldwide [[Anti-money laundering|AML]] military-industrial complex and the batty dogma of [[outsourcing]] are all cases where those “prosecuting” the outlook stick with it notwithstanding the weight of evidence that the theory is at best useless, and may lead to the opposite of the desired outcome.

The “prosecutor’s tunnel” begins with clear but simplistic — ''misleading'' — models of a messy world. Humans have a weakness for these: we are pattern-matching, puzzle-solving animals. We are drawn to neatness. We resile from intractability as it indicates ''weakness'': that our frail human intellect has been defeated by the ineffable natural order of things.
===An elegant hypothesis===
{{drop|S|ometimes the sheer}} elegance of a prosecutor’s case can crowd out common sense and the basic intuition that ''this cannot be right''.

We have built our legal institutions to be vulnerable to this kind of crowding out. Criminal law proceeds upon [[data]] and the weight of ''evidence'' but disallows “intuition”. Hence, there is an asymmetry: evidence is better at saying what ''did'' happen than what did ''not''. This is especially so where there is no [[direct evidence]] that the defendant actually did what she is accused of.

Circumstantial evidence does not directly implicate a defendant but is [[consistent with]] the prosecution theory. It accumulates: if there is enough of it, and none points away from the defendant, it can tell us something. But remember [[Correlation|correlation and causation]]: evidence that is “[[consistent with]]” a prosecution theory does not prove it. That JC owns a bicycle is ''consistent'' with his competing in the ''Tour de France''; it does not make him any more likely to ''do'' it. Evidence can look more meaningful than it is. This is where intuition ought to be able to help us.
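Loosely, the bicycle point is about likelihood ratios. Here is a toy sketch in which every number is invented for illustration (none comes from the text): evidence that is merely “consistent with” a theory carries only a modest likelihood ratio, and a modest likelihood ratio barely moves a minuscule prior.

```python
# Toy numbers, invented for illustration only: they are not from the text.
def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes in odds form: posterior odds = prior odds x likelihood ratio."""
    return prior_odds * likelihood_ratio

# "JC owns a bicycle" as evidence for "JC rides the Tour de France".
p_bike_given_rider = 0.99      # nearly all Tour riders own a bicycle (assumed)
p_bike_given_nonrider = 0.40   # plenty of non-riders own one too (assumed)
lr = p_bike_given_rider / p_bike_given_nonrider   # about 2.5: weakly "consistent with"

prior = 1 / 1_000_000          # assumed prior odds that a given person rides the Tour
post = posterior_odds(prior, lr)
print(f"likelihood ratio: {lr:.2f}")
print(f"posterior odds:   {post:.2e}")   # still vanishingly small
```

On these assumed numbers, owning a bicycle multiplies the odds by about 2.5, which takes a one-in-a-million hypothesis to roughly 2.5 in a million: still nowhere.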
====An asymmetry of evidence====
As it is, intuition’s role is relegated to underpinning the presumption of innocence. A prosecutor must prove guilt; the accused need not prove ''anything'': she cannot be expected to explain what happened, for the simple reason that an innocent person should have no better idea about it than anyone else. The jury, we hope, leans on its intuition when conjuring doubts.

Experience tells us otherwise. In what follows, JC takes three notorious cases from the antipodes to see what can happen when, with no direct evidence, those arguing the case become afflicted with tunnel vision, and intuition and common sense are relegated behind “data” and circumstantial evidence. Then we will look at what causes this condition.
{{gbullet|Case study: [[Lindy Chamberlain]]<li>Case study: [[Peter Ellis]]<li>Case study: [[David Bain]]}}
==Interlude: difficult cases from down under==
====A ring of dust around Ayers Rock====
{{drop|L|indy and Michael}} Chamberlain and their three children were camping at Ayers Rock in central Australia in August 1980.<ref>''{{plainlink|https://open.spotify.com/show/4sADNVGYAf0VnP6TqaBW3i|A Perfect Storm: The True Story of the Chamberlains}}'' is a fabulous account of the whole affair.</ref> The adults were with other campers around a campfire when Lindy heard a disturbance near the tent where her infant daughter Azaria was sleeping. She went to check on the baby and thought she saw a dingo running out of the tent. When she got to the tent, the child had vanished.

Lindy raised the alarm at once, but Azaria was never found.

Dingo attacks on humans at the time were rare, and the police believed Lindy was behaving strangely. They regarded her “dingo” explanation as absurd.<ref>A common schoolyard joke at the time: Q: What is the ring of dust rising around Ayers Rock? A: The dingoes doing a lap of honour.</ref> They, and quickly thereafter the public, concluded that Lindy had murdered and disposed of her baby.

All kinds of theories circulated owing to the Chamberlains’ unusual religious beliefs and “odd behaviour” in the aftermath of Azaria’s disappearance. But devout Christianity is hardly a solid [[Bayesian prior|prior]] indicating a tendency to murder. Nor is “odd behaviour” in the aftermath of a mother’s most extreme psychological trauma. Who would ''not'' behave oddly in those circumstances?

The police built their case from the little positive evidence they had: Lindy’s absence from the campfire gave her an opportunity; her strange religious beliefs — the Chamberlains were Seventh-Day Adventists — gave her a motive; what appeared to be spattered infant blood in the footwell of the Chamberlains’ car provided forensic evidence; and Lindy’s odd behaviour when interviewed provided corroboration.

Many aspects of the police case were highly implausible: logistically, it was almost impossible for Lindy to have murdered Azaria in the way proposed — with blunt scissors — and disposed of the body and all evidence in the five minutes available to her. Never mind how unlikely it was for a mother — let alone a devout Christian mother — to murder her own infant in cold blood.

That ''anyone'' could bring themselves to cold-bloodedly murder a nine-week-old baby is hard to imagine. Statistically, it is highly improbable. That the child’s own mother would is, in the absence of compelling evidence, ''preposterous''. Even to start with this theory you must surely have compelling grounds to believe it over all other possibilities — if not credible eye-witness evidence, then a documented history of violence, behavioural volatility or psychiatric illness grave enough to overthrow the strong human instinct to protect vulnerable infants. Lindy Chamberlain had no such history.

If there is ''any'' plausible alternative explanation for the baby’s disappearance, ''there must have been a reasonable doubt''. It need not be more probable than the prosecution case: just ''not out of the question''. [[Lindy Chamberlain]] provided one: a dingo snatching the child might have been unprecedented, but it was possible. There were dingoes in the area. They are predators. They are strong enough to carry away a human infant. A dingo was no less likely than a new mother noiselessly murdering her own infant just yards from a group of independent witnesses. That ought to have been the end of it.

Nevertheless, in 1982, Lindy Chamberlain was convicted of Azaria’s murder and spent three and a half years in prison before Azaria’s matinee jacket was found, four kilometres from the campsite, at the entrance to a dingo lair. Lindy was released and pardoned, but her conviction was not finally quashed until 1992.

The “blood spatter” in the footwell of the Chamberlains’ Holden Torana turned out, much later, to be a standard sound-deadening compound applied during the car’s manufacture.
====Satanic panic in the Garden City====
{{drop|I|n 1991, Peter}} Ellis, a childcare worker at a daycare centre in New Zealand, was charged with horrific abuse of several preschool children in his care.<ref>''{{plainlink|https://open.spotify.com/show/1ZHmLlciFEQ33kkSPZTtyC|Conviction: The Christchurch Civic Creche Case}}''</ref> Police alleged, on the children’s own evidence, that, among other things, Ellis abducted the children en masse during the day and subjected them to bizarre rituals and acts of unthinkable cruelty and violence.

In total, one hundred and eighteen children were interviewed by police and social workers.

In 1993 Ellis was convicted on 16 counts of child sex abuse against seven children. Investigators had discarded evidence from children who did not report abuse, and had set aside patently impossible claims, meaning that the evidence put before the court — and disclosed to the defence — appeared more compelling than it would have if viewed in its wider context. It transpired later that the techniques the police and social workers used may well have encouraged the very young children to make their stories up.

But none of the allegations were true.

Likewise, what [[Peter Ellis]] was alleged to have done is ''extraordinarily'' improbable. There are few documented cases of ritualistic abuse on that scale anywhere in the world. There are none in New Zealand. For such a thing to have happened without any prior evidence of such behaviour, with no adult witnesses, no one noticing the absent children and none of the children bearing any trace of their supposed injuries makes it even less likely.

And there was a plausible alternative: ''nothing happened at all''. All that was required for that to be true was for preschool children, perhaps at the prompting of interviewers already in the grip of [[prosecutor’s tunnel vision]], to make things up. By comparison with “untraceable, unwitnessed, wide-scale ritual satanic abuse”, “children exercising their imaginations to please adults” ''is not improbable''.

Ellis maintained his innocence throughout and continued to fight to clear his name, but died in 2019. The New Zealand Supreme Court finally quashed all remaining convictions in 2022, citing a substantial miscarriage of justice arising from the unbalanced presentation, and the contamination, of the children’s evidence.
====Murder in the family====
{{drop|O|n the morning}} of 20 June 1994, twenty-two-year-old David Bain returned from his paper round at 6:45am to find his whole family had been shot dead.<ref>''{{plainlink|https://open.spotify.com/show/3jgt4218rVMRh9OcIRu8qd|Black Hands: A Family Mass Murder}}''</ref> He did not discover this immediately: it was still midwinter dark at that hour, deep in the Southern Hemisphere. Without switching on a light, David first went downstairs to put on a load of laundry. He later told police it was so dark he did not notice his father’s bloodstained clothes on the machine, and inadvertently washed them with his own, obliterating key evidence.

Returning upstairs, David discovered his father Robin lying in the living room beside a .22 rifle with a bullet wound in his head. Quickly thereafter he found the bodies of his mother, two sisters and youngest brother, who appeared to have put up some kind of fight before being overcome.

A note typed on the family computer, apparently by David’s father Robin, said, “You were the only one who deserved to live.” David placed an agitated call to emergency services. It was recorded, and remains a part of the public record.

David told police his father, motivated by a troubled relationship with the family, must have murdered them all before turning the gun upon himself.

Based on circumstantial evidence — bloodstains on his own clothing and spectacles, fingerprints on the murder weapon, minor bruises and abrasions consistent with a struggle with his brother, and a lack of any evidence pointing to his father — David Bain was charged with all five murders and convicted on all counts.

David maintained his innocence.

In 1996 Joe Karam, a former New Zealand rugby international, became involved after reading a newspaper article about university students raising money to fund an appeal. Karam became fascinated by the case and was persuaded of David’s innocence. He brought significant publicity to David’s cause and championed his innocence, uncovering shortcomings, inconsistencies and oversights in the police investigation and in the handling of evidence.

Eventually, David’s case made it to the Privy Council where, twenty years after the original trial, the court quashed Bain’s convictions and ordered a retrial. The second jury was not persuaded of David’s guilt and he was acquitted of all charges. He remains a free man.

It is different for David Bain. While it is true that familicide is extremely rare and, therefore, absent [[Bayesian prior|prior]] evidence, highly improbable, there is no question that the Bain family were murdered. The only question was ''by whom''.

On David’s own theory, only two people could have done it: his father or himself. It was, therefore, ''definitely'' familicide: the abstract improbability of that explanation is beside the point. The probability that David was responsible is correspondingly much higher: before considering any further evidence, there is a 50% chance he was responsible.

And a lot of the further evidence pointed in his direction. To ''not'' be the murderer, on his own evidence, David would have had to be ''extremely'' unlucky — forgetting to turn on the light, inadvertently disposing of exculpatory evidence, having incriminating injuries he could not explain — while no such evidence pointed to Robin. David’s defenders had their own [[tunnel vision]]: they focused narrowly on the provenance of each piece of incriminating evidence, identifying formal shortcomings in its value as evidence, questioning the manner of its collection and the chain of custody, raising ''possibilities'' of innocent explanations without evidence to support them, and disregarding the wider context of the whole case.

Now, David Bain was acquitted of all charges. On the evidence, the jury could not rule out the ''possibility'' that Robin Bain was responsible. Not being satisfied beyond reasonable doubt that David was the perpetrator, it correctly acquitted him at law. But it remains ''likely'' that David ''was'' the perpetrator.<ref>Christchurch journalist Martin van Beynen’s fantastic podcast ''{{plainlink|https://interactives.stuff.co.nz/blackhands/not-guilty/|Black Hands}}'' compellingly makes this case.</ref> As a piece of judicial procedure, the comparison between Bain’s case and those of Ellis and Chamberlain is stark.
====Narrative biases====
{{drop|T|hese cases illustrate}} the problem of relying on circumstantial evidence: with no independent ''direct'' evidence, one tends to start with a hypothesis and fit whatever secondary and forensic evidence one has into it, discarding whatever does not fit. This is the classic [[Prosecutor’s tunnel vision|tunnel vision]] scenario. It can afflict those who would ''defend'' suspects just as firmly as those who prosecute them.
====Tunnel vision and circumstantial evidence====
{{drop|W|here there is}} reliable [[direct evidence]] — eyewitnesses, recordings, causative links between a suspect and the allegation — there is little need for inference; the evidence speaks for itself. But cases comprising predominantly [[circumstantial evidence]] — which therefore depend on inferential reasoning — are vulnerable to tunnel vision, because the complex of cognitive biases that makes up [[prosecutor’s tunnel vision]] affects the process of inference.
====Upstanding citizen turns master criminal. Does well.====
{{Drop|P|rosecutor’s tunnel vision}} cases often involve hitherto law-abiding citizens suddenly committing fiendish crimes without warning, explanation or motive.

Now JC is, ahem, ''told'' that committing violent crime without leaving ''any'' incriminating evidence is ''extremely'' hard. Especially in a controlled environment like an infants’ daycare centre or a hospital.

To be sure, serial criminals ''can'' operate in these environments, but they will need to be ''good'': meticulous in their preparation and method. Over time, they will hone their techniques and perfect a ''modus operandi'', acquiring a ghoulish sort of ''[[expertise]]'' in murder: killing patients in a closely monitored, controlled environment populated by trained experts hardly lends itself to opportunistic, freestyle offending. Hospitals, in particular, overflow with specialists who can detect subtle clues that ordinary laypeople — and burgeoning criminals learning their craft — have no idea about.

As with any complicated discipline, one learns as one goes. We should not, therefore, expect “beginners” to perform like master jewel thieves, slipping in and out, striking in the dark and leaving no trace. They will blunder. They will be careless. ''They will leave evidence''. They will slip up, leave giveaways and clumsily trigger red flags. From new criminals, we should expect “smoking guns”.

So if a strange confluence of events is accompanied by ''no'' smoking gun, this too has some prior probability value. It does not ''exclude'' the possibility of foul play, but it does make it ''less likely''.

People do not often flip, overnight and without warning, from conscientious citizens to compulsive criminals. If they did, we would ''notice'' it.<ref>They might snap into a sudden orgy of extreme violence — but this plays out as desperate, meltdown ''mass'' murder, not calculated ongoing ''serial'' murder, and there is generally no doubt that it is murder and no shortage of [[direct evidence]] implicating the accused.</ref> When hitherto law-abiding people do slide into criminality, there is generally motivation, a history of antisocial behaviour, identifiable psychological trauma, drug dependency, observed personality change over time or diagnosed mental illness.<ref>Mental illnesses with a clear medical pathology, that is: not ones suspiciously made up out of ''ex post facto'' symptoms, like “Munchausen by proxy”. See the “circular correspondence bias” discussion below.</ref> Often ''all'' of these things. (Let us call them “[[Criminal propensity|criminal propensities]]”.)

The absence of ''any'' criminal propensity in a suspect’s makeup should ''reduce'' the “prior probability” of foul play by that suspect. As we will see, “circular correspondence bias” can take such a ''lack'' of criminal propensity and somehow invert it into confirmation.

Where a crime has certainly been committed, this goes only to ''who'' the perpetrator is. There may (as in David Bain’s case) be only a small universe of credible suspects. If ''all'' “possible suspects” have the same lack of criminal propensity, it will count for little. But if the universe of “potential suspects” is large — or if it is plausible that ''no crime was committed at all'' — an individual’s lack of any criminal propensity should tell us something “circumstantial”.

Neither Lindy Chamberlain nor Peter Ellis had any criminal propensity, and in both cases there was a plausible alternative explanation. For David Bain it was different.
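The point about the size of the suspect pool can be put in the same hedged Bayesian terms. Every number below is invented: the likelihood ratio assigned to a “clean history” is an assumption, and where every plausible suspect shares the same clean history the factor cancels out, which is why it counted for little in Bain’s case.

```python
# A hedged, illustrative sketch: all numbers are invented, not taken from
# the cases. A "clean history" (no criminal propensity) is modelled as a
# likelihood ratio below 1 for guilt.
def posterior(prior: float, lr: float) -> float:
    """Bayes via odds: posterior odds = prior odds x likelihood ratio."""
    odds = prior / (1 - prior)
    post_odds = odds * lr
    return post_odds / (1 + post_odds)

LR_CLEAN_HISTORY = 0.2  # assumed: a clean history is 5x likelier in non-offenders

# Two credible suspects (Bain): a 50% prior. Both men had clean histories,
# so in truth the factor cancels; even applied one-sidedly it leaves a
# thoroughly live hypothesis.
print(posterior(1 / 2, LR_CLEAN_HISTORY))

# A large suspect pool, or "no crime at all" on the table (Chamberlain,
# Ellis): the same factor drives an already small prior towards zero.
print(posterior(1 / 1000, LR_CLEAN_HISTORY))
```

The design point: a clean history is weak evidence in a two-horse race, but decisive when the alternative is that nothing criminal happened at all.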


====Standards of proof====
====Burden and standard of proof====
{{drop|T|he prosecution’s standard}} of proof is, in theory, high: ''beyond reasonable doubt''. It isn’t clear that quite achieves what it is meant to. Courts have moved to dumb it down: that time-honoured phrase has been discarded and juries are directed to convict only if they are “sure”. While this is meant to mean the same thing, not all are persuaded that is how juries understand it.<ref>{{Plainlink|https://www.newlawjournal.co.uk/content/dreaded-questions-doubtful-answers-the-trouble-with-sure-|New Law Journal: The Trouble With “Sure”}}</ref> And there is some reason to think that juries start with a presumption that the accused is guilty at least to the balance of probabilities: assuming the police acted in good faith, why else would the defendant be in the dock?
{{drop|T|he [[burden and standard of proof|''burden'' of]]}} proof is a different thing to the ''standard'' of proof. The burden is who has to prove their case: this falls squarely on the prosecution. The defence is not required to prove anything, least of all the accused’s innocence.


But a scenario where tendentious data may be introduced in support of guilt but there is a total ''lack'' of “data” supporting exoneration — only the intuition that it seems ''highly unlikely'' that such a person should do such a thing — may lead to that confusion. Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the evidence of bloodlike spatter in the footwell of her husband’s car. The intuition that a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation was not before the court. Somehow the jury was persuaded not just that she did it, but that there was no plausible alternative explanation.
But there is tension between that crystalline legal theory and the practical reality: it is in the defendant’s interest that ''someone'' casts doubt into jurors’ minds. Since the Crown plainly won’t be doing that, the defence must either rely on jurors to confect plausible doubts by themselves, or it must plant some doubts there. It is a brave defence counsel indeed who puts her client’s future in the hands of a jury’s imagination and capacity for creative thought.


[[JC]] draws upon ''{{plainlink|https://media.law.wisc.edu/m/2fjzd/findley_scott_ssrn_copy-1.pdf|The Multiple Dimensions of Tunnel Vision in Criminal Cases}}'' by Keith Findley and Michael Scott in the Wisconsin Law Review (2006) and {{author|Robert Cialdini}}’s {{br|Persuasion}}. To some extent also the madness of crowds and Jon Haidt’s {{br|The Righteous Mind}}. The lesson we draw is that ''we are not as rational as we like to think'' and ''data is never the whole story''.
All the same, the prosecution’s ''standard'' of proof — what it must do to discharge its burden of proof — is, in theory, ''extremely'' high. Courts have dumbed down the time-honoured phrase ''[[beyond reasonable doubt]]'': these days, juries are directed to convict only if they are “''sure''”. This is meant to mean the same thing, but not everyone is persuaded that is how juries understand it.<ref>{{Plainlink|https://www.newlawjournal.co.uk/content/dreaded-questions-doubtful-answers-the-trouble-with-sure-|New Law Journal: The Trouble With “Sure”}}</ref>


It may describe ''all'' view-forming of a “conviction” kind. They are like political and religious views in that, once they take root, they are not easily displaced.
There is some reason to think that juries start with an ''[[ad hoc]]'' presumption that ''any'' defendant put before them is ''somewhat'' likely to be guilty: if the police were competent and acted in good faith, why else would the defendant be in the dock?


The “wrongful conviction” cases are bracing because, with hindsight, a better narrative and having taken a different cognitive path to the prosecutors, it is so hard to understand how they got there, or why they persisted with such plainly untenable views. If we treat prosecutor’s tunnel vision as a variety of political or even religious conviction, we can see better how “prosecutors” can be so energetic in their maintenance of a bad [[model]]. It perhaps explains the gruesome in-house performance in the {{poh}}.
So where there is only ''tendentious'' data supporting a defendant’s guilt but a total ''lack'' of “data” supporting her innocence —  what evidence could there be that you did not do something that did not happen? — there are grounds for confusion here, and there is good evidence that juries do indeed get confused.  


Prosecutors need not be ''literal'' prosecutors: campaigners for innocence, and conspiracy theorists suffer at the hands of the same collection of cognitive traps. Both sides of the public conversation about [[Lucy Letby]] are similarly afflicted with tunnel vision: hence, allegations of conspiracy from both sides.
Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the circumstantial evidence of what looked like blood sprays in the footwell of the family car.<ref>In fairness the crown submitted expert forensic analysis entered that it was specifically infant blood, so you can hardly fault the jury here. You can fault the crown forensics team though: it turned out to be acoustic deadening spray and not blood of any kind!</ref>


==The three phases of tunnel vision==
Evidence supporting the intuition that “a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation” was not before the court. What evidence could there be of that? Somehow the jury was persuaded not just that she did murder her child, but that there was no plausible alternative explanation for the child’s disappearance. This was largely thanks to the strange collection of cognitive biases to which the prosecution had succumbed.
Tunnel vision has three phases: first, the background conditions arise to make us ''vulnerable'' to tunnel vision in the first place; secondly, those that push us into a given tunnel; the third are those cognitive artefacts that ''keep'' us there.  


Call these “setting out”, “getting there” and “staying there”.
{{drop|S|o what is}} “prosecutor’s tunnel vision”, then and how does it come about?<ref>[[JC]] draws upon ''{{plainlink|https://media.law.wisc.edu/m/2fjzd/findley_scott_ssrn_copy-1.pdf|The Multiple Dimensions of Tunnel Vision in Criminal Cases}}'' by Keith Findley and Michael Scott in the Wisconsin Law Review (2006).</ref>
It is a sort of “emotional conviction” to an (as-yet) unproven explanation. We become personally invested in a narrative; the consequences — and personal costs — of rejecting the conviction are great, and grow the more we commit to the position.


In order of appearance:
Tunnel vision has three phases: first, there must be enabling background conditions that make us ''vulnerable'' to tunnel vision; second, there are pathways ''into'' a given tunnel; third, there are cognitive biases that ''keep'' us there.


====Background====
The lessons are two-fold:
{{quote|''We are not as rational as we like to think'' and <br>''Data is never the whole story''.}}
===Background conditions===
{{Drop|C|ertain dispositions, biases}} and miscellaneous psychological tics come together to create the conditions for tunnel vision to swamp an earnestly-held narrative:
{{Drop|C|ertain dispositions, biases}} and miscellaneous psychological tics come together to create the conditions for tunnel vision to swamp an earnestly-held narrative:


=====The “anchoring” effect=====
====Mainly circumstantial evidence====
When making decisions we tend to “anchor” our expectations on the first piece of information we get, and then recalibrate as we go, not against some abstract sense of rectitude, but ''by reference to the anchor''. Our initial impression can therefore disproportionately influence the model we draw and our later assessment of responsibility.
In that [[tunnel vision]] is a collection of cognitive biases infecting how we draw ''inferences'', it usually arises where there is no [[direct evidence]] of wrongdoing. Where there are reliable eyewitnesses there is little need to ''infer'' what happened: someone ''saw'' it. Where there are no witnesses and the case depends on [[circumstantial evidence]] — even more so where it isn’t clear there was a crime ''at all'' — the jury must ''infer'' what happened from purely [[circumstantial evidence]].  


This is the theory behind the “discount sticker” in a car showroom: You are already getting a great deal, and you haven’t started haggling!<ref>Anchoring is relatively well documented. Kahneman and Tversky asked subjects to to spin a wheel of fortune and, after spinning, to estimate the percentage of African nations in the UN. Those who landed on a higher number gave significantly higher estimates than those who landed on a lower number.</ref>
There was no clear evidence Azaria Chamberlain was dead, let alone murdered: she was simply missing. Had a reliable witness seen a dingo carrying her away — even Lindy Chamberlain did not claim to have seen that — there would be little scope for inference about the significance of the red-brown material spattered under the dashboard of the Chamberlains’ car. ({{plainlink|https://www.famous-trials.com/dingo/465-galvinfindings|The coroner’s damning report to the crown prosecutor}} in the Chamberlain case is a chilling example of tunnel vision.)


====Information glut====
{{quote|{{neil postman information glut}}}}
The more [[circumstantial evidence]] there is, the more scope for inference, and the more fantastical narratives one can draw. If the alleged crime occurs in a tightly-controlled environment designed to generate technical data and specialist information, and where deep subject matter experts are on hand to observe and analyse that information in retrospect, conditions are ripe for tunnel vision.  


This is ''exactly'' the scenario in which the [[healthcare serial murder]] cases arise. The alleged crimes, though rarely witnessed, take place within a carefully controlled environment<ref>In the [[healthcare serial murder|cases]] the JC has managed to track down, not ''one'' involves anyone seeing the accused commit any unequivocal act of harm</ref>. Access is monitored by CCTV and controlled by swipecards and elaborate security systems. Detailed medical protocols govern and track the storage and dispensation of medicines. Sophisticated equipment — machines that go “ping” —  monitor patients’ vital signs around the clock. Nurses, orderlies, consultants and doctors constantly mill around at all hours, doing ward rounds, checking in on patients and generally keeping an eye out for signs of trouble.


Though these systems seem incapable of capturing direct evidence of wrongdoing, they still generate a ''colossal'' amount of very “scientific” medical and digital data. This is capable of being analysed and framed to support — to be “[[consistent with]]” — any number of different and frequently contradictory inferences and theories of the case.


====Expert overreach====
{{quote|To a man with a hammer, everything looks like a nail.
:—Abraham Maslow}}
It is hardly news that existing knowledge and beliefs shape what we see and how we see it. Prosecutors (and defenders) hold pet theories about human behaviour just like anyone else.


You don’t sign up for the army if you hope not to shoot a gun. Nor do you join the police hoping to never find crime, nor take a crown warrant if you don’t expect to ''prosecute''. Those involved in prosecution are primed this way, as are we all: to be ''useful''; for their roles to be ''important'' and to ''make a difference''.  


This is no less so for expert witnesses. They are incentivised to support the cases on which they are engaged.<ref>There is some debate about expert witness [[conflicts of interest]] in criminal law circles but, given how much public interest there is in the question, it has had little public exposure as yet: perhaps {{poh}} and the Letby case will change that.</ref>
Yet [[subject matter expert]]s can overestimate their ability to analyse and judge subtle problems, especially those in fields ''adjacent'' to, but not directly within, their expertise. They may over-weight the overall significance of matters that do fall within their expertise against those that do not. They are less likely to consider alternative models, explanations, theories or evidence that de-emphasise their expertise, let alone theories of the case that contradict it.  


This kind of expert overreach is germane to the “[[healthcare serial murder]]” cases. Human biology is, in the technical sense, ''[[complex]]''. There is much about it that even experts do not yet, and may never, know. Patients may have conditions that are never tested for, and therefore never detected, before or after death. They may have conditions ''as yet unknown to medical science'', in which case established tests would not have revealed them anyway.


====High improbability, whatever the explanation====
When the allegation involves ''extremely improbable events'' — where there is ''no'' commonly experienced explanation — our natural human weakness in statistical reasoning comes into play. [[Base rate neglect]] (see below) becomes a risk.


Both possible explanations for Azaria Chamberlain’s disappearance (abduction by a dingo and maternal infanticide) were ''extremely improbable''. What little data there was on dingo abductions suggested they were rare, but the data were not good: not many people camp with infants in the Outback, so dingoes rarely get the chance to take them. There is a ''lot'' of evidence that maternal infanticide is ''extremely'' improbable: ''you'', dear reader, are evidence of that.


Now, had dingo attacks been commonly reported before Azaria’s disappearance, the possibility of maternal infanticide might never have come up.


The “[[healthcare serial murder]]” cases typically share this feature: both the crime (serial murder) ''and'' the “innocent alternative” (an unusual cluster of natural deaths coinciding with the attendance of the same nurse) are intrinsically improbable. Base rate neglect — healthcare serial murders remain vanishingly unlikely — is a real risk.


===Getting there===
{{Drop|O|nce you are}} equipped with a hammer and have started wandering around the house looking for nails, there should still be scope to falsify a bad operating theory. But again, psychological biases can override the dispassionate application of cool logic.


====Hopeful extrapolation====
We are natural riddle-solvers. We build models of the world by habit. It is easy to slip beyond our range of reliable experiences and form theories for which we have little expertise, especially where abundant circumstantial evidence is [[consistent with]] — able to be fitted to — our side of the argument, rather than dispositive of it.


Litigation’s adversarial nature, in which advocates are meant to present their arguments in the best possible light, emphasising helpful facts and neglecting unhelpful ones, hardly helps. The underlying philosophy here is akin to Adam Smith’s “invisible hand”: from the interaction of opposed, self-interested advocates we expect the invisible hand of justice miraculously to emerge.


====Base rate neglect====
''Lawyers are not natural statisticians''. We should be careful with statistics, especially when they concern extremely improbable events.


“Base rate neglect” — the “prosecutor’s fallacy” — is the natural tendency to ignore the “base rate” of an outcome in a population and instead focus on “individuating” information about the specific case.  


Linda is a single, middle-aged female philosophy graduate who is active in CND. If asked which description of Linda is more likely to be true, most people will choose “she is a bank teller who is active in the feminist movement” over “she is a bank teller” when plainly the former is a subset of the latter.<ref>{{Plainlink|https://www.psychologytoday.com/gb/blog/the-superhuman-mind/201611/linda-the-bank-teller-case-revisited|per Kahneman and Tversky}}. This work is not without its critics. JC wonders whether the same thing propels the commercial solicitor’s compulsion to over-description.</ref>
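The fallacy is structural: a conjunction can never be more probable than either of its conjuncts. A minimal sketch in Python makes the point (the population and counts are invented purely for illustration):

```python
# The conjunction fallacy: P(A and B) can never exceed P(A).
# Toy population (numbers invented): 1,000 people, of whom 50 are
# bank tellers and 10 of those are also active feminists.
population = 1000
tellers = 50
feminist_tellers = 10  # a subset of the 50 tellers

p_teller = tellers / population                        # 0.05
p_teller_and_feminist = feminist_tellers / population  # 0.01

# "Teller and feminist" is a subset of "teller": however vividly the
# description fits Linda, it cannot be the likelier option.
assert p_teller_and_feminist <= p_teller
print(p_teller, p_teller_and_feminist)
```

However the toy numbers are varied, the assertion holds: the subset can never outgrow the set.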


Say a medical test is expected to give a correct result 999 times out of 1,000. We are tempted to assume, therefore, that any positive result will be pretty much conclusive. But if the general prevalence of the condition in the population — the “base rate”  — is just 1 in 100,000 then for every ''true'' positive result, we should expect 100 ''false'' ones. The probability that a positive test is accurate is only 1%. <ref>This sounds properly nuts but it is true. Assume you test 100,000 people. At a 99.9% accuracy rate, you will expect 99,900 correct results, and 100 false ones. But in that sample of 100,000 you would only expect 1 actual case of the condition. So if you are tested and receive a positive result, you have a 1 in 100 chance that it is a true result.</ref>
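The arithmetic is a one-line application of Bayes’ theorem. A sketch in Python, using only the hypothetical figures from the example above (and assuming, for simplicity, that the test never misses a true case):

```python
def p_condition_given_positive(false_positive_rate, prevalence, sensitivity=1.0):
    """Bayes' theorem: the chance a positive test is a true positive."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# 1-in-1,000 false positive rate; 1-in-100,000 prevalence.
ppv = p_condition_given_positive(1 / 1000, 1 / 100_000)
print(f"{ppv:.1%}")  # about 1%: roughly 100 false positives per true one
```

The “999 times out of 1,000” accuracy figure dominates our intuition; the 1-in-100,000 base rate, which does all the work, gets ignored.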


Take the [[healthcare serial murder]]er: say there is a 1 in 342 ''million'' chance that a nurse would be on duty for all the suspicious deaths in a cluster at random. If one nurse ''was'' on duty for all those suspicious deaths, we might suppose it to be damning. The jury in the trial of Dutch nurse Lucia de Berk did: they convicted her of serial murder in 2003.


But this is to ignore the base rate. With no other prior information suggesting her guilt (in de Berk’s case, there was none) what is the probability that a given individual is a [[healthcare serial murder]]er?


For those odds to be ''as likely as'' random chance — not more — there would need to be ''twenty-three'' [[healthcare serial killers]] operating at any given moment. There do not appear to be that many in the world, let alone in healthcare environments.


Had that probability been correct (it turned out to be a wild underestimate), sheer chance was ''still'' a likelier explanation than that Lucia de Berk was a serial murderer. (The probability of her shift patterns coinciding by chance was subsequently reassessed as more like 1 in 25. De Berk’s conviction was overturned in 2010.)
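The “twenty-three” figure is simple arithmetic: for murder to be as likely an explanation as chance, the base rate of healthcare serial killers in the population would have to equal the probability of the chance coincidence. A rough Python sketch (the eight-billion world population is an assumption for illustration):

```python
WORLD_POPULATION = 8_000_000_000  # rough figure, assumed for illustration

def killers_needed_for_even_odds(p_chance_coincidence, population=WORLD_POPULATION):
    # For even odds, the base rate of serial killers in the population
    # must match the probability of the coincidence arising by chance.
    return population * p_chance_coincidence

print(round(killers_needed_for_even_odds(1 / 342_000_000)))  # about 23
print(round(killers_needed_for_even_odds(1 / 25)))           # 320 million
```

On the corrected 1-in-25 figure, chance needs no help at all: there would have to be 320 ''million'' serial killers at large for the two explanations to break even.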


====Confirmation bias====
All observation is theory-dependent: scientists must first have a theory before they can test it: otherwise, how do they know what they are looking for?


Having committed to a bad theory, we tend to seek out, interpret, and best remember information that confirms it. We may over-weight supporting evidence and disregard contradicting evidence. This is what the adversarial system expects.
 
This is confirmation bias. It has two phases. The first applies in “theory formation”: the man with a hammer is primed to see nails.
 
Suspicious that strangely-behaving [[Lindy Chamberlain]] might have something to hide, Police searched her possessions, tent and vehicle, eventually finding what they believed to be “[[fetal blood]]” splashed under the dashboard of her husband’s car.<ref>Much later the substance was retested. It turned out to be sound-deadening material containing iron oxide.</ref> This set the police in a direction. Before long they had constructed an elaborate theory of how [[Lindy Chamberlain]] had committed murder.
 
The second aspect is in theory ''corroboration''. Here evidence which, when taken in the abstract, would tend to “be [[consistent with]]” innocence can, with confirmation bias, be presented as damning.
 
We would ''expect'' an innocent mother to behave “oddly” in the aftermath of her infant daughter’s disappearance. We would ''expect'' an innocent nurse to lead a healthy social life. We would ''expect'' an innocent defendant not to admit his crimes or express remorse for his actions.
 
With confirmation bias, though, these very behaviours — things that point ''away'' from the defendant’s guilt — are presented as pointing ''towards'' it.
 
===Staying there===
{{drop|O|nce we are}} in the tunnel, there are cognitive biases that prevent us from finding our way out.
==== Hindsight bias and the reiteration effect ====
In hindsight, people tend to think an eventual outcome was inevitable or more likely than they might have before it happened. “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” versus “Given that this woman’s nine-week-old infant has disappeared from the campsite, and the police suspect her of foul play, what is the chance that she brutally murdered her own child?”


Through “hindsight bias” we project our knowledge of ''outcomes'' onto our perception of past behaviour, without realising that the perception of the past has been tainted by our knowledge of the outcome.


Once a person becomes a prime suspect, hindsight bias suggests that, upon reflection, she was the likely suspect from the beginning. Evidence is malleable in light of this “realisation”.  


There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more they become invested in it. Their conclusions become entrenched, and it appears obvious that all evidence pointed to the defendant from the outset. Prosecutors and defenders find it increasingly hard to consider alternative theories of the situation.


====Randle McMurphy’s dilemma====
“Correspondence bias”, or the “fundamental attribution error”, automatically attributes observed behaviour to malice without considering other explanations. Our favourite, per [[Hanlon’s razor]], is stupidity:
{{quote|{{hanlon’s razor}}}}
But it applies just as well to innocent explanations, and to alternative guilty ones.
=====The circular correspondence bias model=====
A strangely prevalent form of ''circular'' correspondence bias works as follows:


There is ''weak'' [[circumstantial evidence]] that X committed a crime.
 
The (highly unusual) traits of people who commit that sort of (highly unusual) crime are attributed to X (this is a “fundamental attribution error”).
 
X’s newly-attributed traits are cited as evidence corroborating the existing weak circumstantial evidence. This is circular.
 
X is now characterised as a “sociopath”, “narcissist”, “attention-seeker”, or even “Munchausen’s syndrome by proxy sufferer”, so is more likely to have committed the highly unusual crime.
 
Other aspects of X’s behaviour which, but for the allegation, would be normal, now appear to verify the trait. (He socialises normally. He sends condolences. He works extra shifts.) This, of course, is also circular.
 
X is convicted on the compelling evidence of the opportunity and his highly unusual and suspicious behaviour.
 
You might recognise this as the plot driver of Ken Kesey’s ''One Flew Over the Cuckoo’s Nest''.
 
=====A worked example: Doctor Bob=====
So, say there is an unusual cluster of deaths in the geriatric ward of a given hospital. All deaths — eight of them, over six months — coincide with the shift pattern of one doctor, Bob.
 
This raises a weak but, Bob thinks, easily rebuttable presumption of foul play on Bob’s part. (“Since I didn’t do it, there will be no direct evidence and no strong circumstantial evidence specifically implicating me.”)
 
Before this statistical correlation, Bob was not under suspicion. Nor was his behaviour unusual: he was a normal young, diligent doctor with an active social life. Nor does his behaviour change ''after'' the cluster of deaths.
 
But the foul play — if that’s what it is — is ''horrific'': someone is systematically murdering little old ladies. Only a psychopath with a stone-cold heart and a narcissistic personality disorder could do that. If it is Bob, he must be a psychopath with a narcissistic personality disorder. This logic is already circular, but the circle is big and conditional enough not to be obvious.
 
We now start inspecting Bob’s behaviour. We find him to be meticulously tidy. He enjoys socialising, attending a regular salsa class on Wednesdays. He sent condolence cards to the families of several deceased patients.
 
Now: what could be more indicative of a stone-cold, narcissistic psychopath who has just murdered someone than a fastidiously tidy person — hence, no evidence, right? — who sent a card to the victim’s wife before going out drinking and dancing with his friends? The net is closing in.
=====A second worked example: Nurse Lucy=====
How an odd series of coincidences might wind up with a diligent neonatal nurse being handed seven whole-of-life sentences:
 
There has been an unusually large number of unexplained collapses. (The thing about random events is that they are clumpy, not perfectly smooth and predictable, but let’s park that.)
 
One explanation for this cluster is that someone is harming babies. (Another is that one of any number of factors, including a literally infinite set of things we don’t even know about, could have prompted it; or indeed that, per step 1, the cluster is simply a truly random variation not prompted by anything in particular. But let’s park those too.)
 
(''Hospital administrators''): Hey, there’s a nurse who was on duty for a ''lot'' of these collapses. Most of them, in fact.
 
So it is not beyond possibility that someone ''is'' deliberately harming babies! Whoah. That’s serious. Let us be on the safe side. Let’s call the police!
 
Please, Mr Policeman, sir, we have had a spike in unexplained deaths in our ICU and we can’t rule out foul play.
 
(''Mr Police''): But this is serious! What makes you think there is foul play?
 
Well, without wishing to tell tales there ''is'' this nurse who keeps turning up like a bad penny. So there is that. But we don’t know: it may be a coincidence.
 
(''Mr Police''): Well, let’s have a look at the data! Only a limited number of people have the plausible opportunity to harm babies deliberately — nurses and doctors in the ICU, pretty much — so we should be able to rule them all out pretty quickly, right? Let’s have a look at the shift rota to see who was on duty.
 
(''Mr Police''): Well, lookee here: there is one, just one, nurse who was on duty for pretty much<ref>Rather embarrassingly, it turned out that the shift rota was erroneous and this nurse was ''not'' on duty in every single case, but hey ho.</ref> every single one of those collapses. Ladies and gentlemen, I think we have our answer.
 
(''Hospital administrators aside''): I always knew she was a wrong ’un.
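The claim parked at step one, that random events are clumpy, is easy to demonstrate. A seeded Python sketch (the sixty collapses and the one-year window are invented for illustration):

```python
import random

# Scatter 60 "collapses" uniformly over a 365-day year, then count them
# in thirteen consecutive 30-day windows. A perfectly smooth process
# would put about 4.6 in each window; chance alone produces clusters.
random.seed(42)  # fixed seed so the sketch is reproducible
days = [random.uniform(0, 365) for _ in range(60)]
windows = [sum(1 for d in days if 30 * w <= d < 30 * (w + 1)) for w in range(13)]

print("per-window counts:", windows)
print("busiest window:", max(windows), "vs smooth average:", round(60 / 13, 1))
```

By the pigeonhole principle alone, the busiest window must hold at least five events; in practice a run of bad luck routinely beats the “smooth” average handily, and some nurse will have been on shift for it.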

Latest revision as of 21:52, 16 September 2024


The same tunnel vision also motivates ideologies, conspiracies and management philosophy: 360-degree performance appraisals, outsourcing, the war on drugs, the worldwide AML military-industrial complex: all are cases where those “prosecuting” the theory stick with it even though the weight of evidence suggests it does not work and may even be counterproductive.

The “prosecutor’s tunnel” begins with clear but simplistic — misleading — models of a messy world. Humans have a weakness for these: we are pattern-matching, puzzle-solving animals. We are drawn to neatness. We resile from intractability as it indicates weakness: that our frail human intellect has been defeated by the ineffable natural order of things.

An elegant hypothesis

Sometimes the sheer elegance of a prosecutor’s case can crowd out common sense and the basic intuition that this cannot be right.

We have built our legal institutions to be vulnerable to this kind of crowding out. Criminal law proceeds upon data and the weight of evidence but disallows “intuition”. Hence, there is an asymmetry: evidence is better at saying what did happen than what did not. This is especially so where there is no direct evidence that the defendant actually did what she is accused of.

Circumstantial evidence does not directly implicate a defendant but is consistent with the prosecution theory. It accumulates: if there is enough of it, and none points away from the defendant, it can tell us something. But correlation is not causation: evidence that is “consistent with” a prosecution theory does not prove it: that JC owns a bicycle is consistent with his competing in the Tour de France; it does not make him any more likely to do it. Evidence can look more meaningful than it is. This is where intuition ought to be able to help us.

As it is, intuition’s role is relegated to underpinning the presumption of innocence. A prosecutor must prove guilt; the accused need not prove anything: she cannot be expected to explain what happened for the simple reason that an innocent person should have no better idea about it than anyone else. The jury, we hope, leans on its intuition when conjuring doubts.

Experience tells us otherwise. In what follows, JC takes three notorious cases from the antipodes to see what can happen when, with no direct evidence, those arguing the case become afflicted with tunnel vision, and intuition and common sense are relegated behind “data” and circumstantial evidence. Then we will look at what causes this condition.

Narrative biases

These cases illustrate the problem of relying on circumstantial evidence: with no independent direct evidence, one tends to start with a hypothesis and fit whatever secondary and forensic evidence one has into it, discarding whatever does not fit. This is the classic tunnel vision scenario. It can afflict those who would defend suspects just as firmly as those who prosecute them.

All kinds of theories circulated owing to the Chamberlains’ unusual religious beliefs and “odd behaviour” in the aftermath of Azaria’s disappearance. But devout Christianity is hardly a solid prior indicating a tendency to murder. Nor is “odd behaviour” in the aftermath of a mother’s most extreme psychological trauma. Who would not behave oddly in those circumstances?

That anyone could bring themselves to cold-bloodedly murder a nine-week-old baby is hard to imagine. Statistically, it is highly improbable. That the child’s own mother would is, in the absence of compelling evidence, preposterous. To even start with this theory you must surely have compelling grounds to believe it over all other possibilities — if not credible eye-witness evidence, then a documented history of violence, behavioural volatility or psychiatric illness grave enough to overthrow the strong human instinct to protect vulnerable infants. Lindy Chamberlain had no such history.

If there is any plausible alternative explanation for the baby’s disappearance, there must have been a reasonable doubt. It need not be more probable than the prosecution case: just not out of the question. Lindy Chamberlain provided one: a dingo snatching the child might have been unprecedented, but it was possible. There were dingoes in the area. They are predators. They are strong enough to carry away a human infant. A dingo was no less likely than a new mother noiselessly murdering her own infant just yards from a group of independent witnesses. That ought to have been the end of it.

Likewise, what Peter Ellis was alleged to have done is extraordinarily improbable. There are few documented cases of ritualistic abuse on that scale anywhere in the world. There are none in New Zealand. For such a thing to have happened without any prior evidence of such behaviour, with no adult witnesses, no one noticing the absent children and for none of the children to bear any trace of their supposed injuries makes it even less likely.

And there was a plausible alternative: nothing happened at all. All that was required for that to be true was for preschool children, perhaps at the prompt of interviewers already in the grip of prosecutor’s tunnel vision, to make things up. By comparison with “untraceable, unwitnessed, wide-scale ritual satanic abuse”, “children exercising their imaginations to please adults” is not improbable.

It is different for David Bain. While it is true that familicide is extremely rare and, therefore, absent prior evidence, highly improbable, there is no question that the Bain family were murdered. The only question was by whom.

On David’s own theory, only two people could have done it: his father and himself. It was, therefore, definitely familicide: the abstract improbability of that explanation is beside the point. The prior probability that David was responsible is much higher: before considering any further evidence, there is a 50% chance he was responsible.

And a lot of the further evidence pointed in his direction. For David not to be the murderer, on his own evidence, he would have had to be extremely unlucky — forgetting to turn on the light, inadvertently disposing of exculpatory evidence, having incriminating injuries he could not explain — while no such evidence pointed to Robin. David’s defenders had their own tunnel vision: focusing narrowly on the provenance of each piece of incriminating evidence, identifying formal shortcomings in its value as evidence — questioning the manner of its collection and the chain of custody, raising possibilities of innocent explanations without evidence to support them — and disregarding the wider context of the whole case.

Now, David Bain was acquitted of all charges. On the evidence, the jury could not rule out the possibility that Robin Bain was responsible. Not being satisfied beyond reasonable doubt that David was the perpetrator, he was correctly acquitted at law. But it remains likely that David was the perpetrator.[1] As a piece of judicial procedure, the comparison between Bain’s case and those of Ellis and Chamberlain is stark.

Tunnel vision and circumstantial evidence

Where there is reliable direct evidence — eyewitnesses, recordings, and causative links between a suspect and the allegation — there is little need for inference; the evidence speaks for itself. But cases composed predominantly of circumstantial evidence — which therefore depend on inferential reasoning — are vulnerable to tunnel vision, because the complex of cognitive biases that makes up prosecutor’s tunnel vision infects the process of inference.

Upstanding citizen turns master criminal. Does well.

Prosecutor’s tunnel vision cases often involve hitherto law-abiding citizens suddenly committing fiendish crimes without warning, explanation or motive.

Now JC is, ahem, told that committing violent crime without leaving any incriminating evidence is extremely hard. Especially in a controlled environment like an infants’ daycare centre or a hospital.

To be sure, serial criminals can operate in these environments but they will need to be good: meticulous in their preparation and method. Over time, they will hone their techniques and perfect a modus operandi, acquiring a ghoulish sort of expertise in murder: killing patients in a closely monitored, controlled environment populated by trained experts hardly lends itself to opportunistic, freestyle offending. Hospitals, in particular, overflow with specialists who can detect subtle clues that ordinary laypeople — and burgeoning criminals learning their craft — have no idea about.

As with any complicated discipline, one learns as one goes. We should not, therefore, expect “beginners” to perform like master jewel thieves, slipping in and out, striking in the dark and leaving no trace. They will blunder. They will be careless. They will leave evidence. They will slip up, leave giveaways and clumsily trigger red flags. From new criminals, we should expect “smoking guns”.

So if a strange confluence of events is accompanied by no smoking gun, this too has some prior probability value. It does not exclude the possibility of foul play, but it does make it less likely.

People do not often flip, overnight and without warning, from conscientious citizens to compulsive criminals. If they did, we would notice it.[2] When hitherto law-abiding people do slide into criminality, there is generally motivation, a history of antisocial behaviour, identifiable psychological trauma, drug dependency, observed personality change over time or diagnosed mental illness.[3] Often all of these things. (Let us call them “criminal propensities”.)

The absence of any of these criminal propensities in a suspect’s makeup should reduce the “prior probability” of foul play by that suspect. As we will see, “circular correspondence bias” can take such a lack of criminal propensity and somehow invert it into confirmation.

Where a crime has certainly been committed, this goes only to who the perpetrator is. There may (as in David Bain’s case) be only a small universe of credible suspects. If all “possible suspects” have the same lack of criminal propensity, it will count for little. But if the universe of “potential suspects” is large — or if it is plausible that no crime was committed at all — an individual’s lack of any criminal propensity should tell us something “circumstantial”.

Neither Lindy Chamberlain nor Peter Ellis had any criminal propensity, and in both cases there was a plausible alternative explanation. For David Bain it was different.

Burden and standard of proof

The burden of proof is a different thing from the standard of proof. The burden determines who must prove the case: it falls squarely on the prosecution. The defence is not required to prove anything, least of all the accused’s innocence.

But there is tension between that crystalline legal theory and the practical reality: it is in the defendant’s interest that someone casts doubt into jurors’ minds. Since the Crown plainly won’t be doing that, the defence must either rely on jurors to confect plausible doubts by themselves, or it must plant some doubts there. It is a brave defence counsel indeed who puts her client’s future in the hands of a jury’s imagination and capacity for creative thought.

All the same, the prosecution’s standard of proof — what it must do to discharge its burden of proof — is, in theory, extremely high. Courts have dumbed down the time-honoured phrase beyond reasonable doubt: these days, juries are directed to convict only if they are “sure”. This is meant to mean the same thing, but not everyone is persuaded that is how juries understand it.[4]

There is some reason to think that juries start with an ad hoc presumption that any defendant put before them is somewhat likely to be guilty: if the police were competent and acted in good faith, why else would the defendant be in the dock?

So where there is only tendentious data supporting a defendant’s guilt but a total lack of “data” supporting her innocence — what evidence could there be that you did not do something that did not happen? — there are grounds for confusion here, and there is good evidence that juries do indeed get confused.

Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the circumstantial evidence of what looked like blood sprays in the footwell of the family car.[5]

Evidence supporting the intuition that “a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation” was not before the court. What evidence could there be of that? Somehow the jury was persuaded not just that she did murder her child, but that there was no plausible alternative explanation for the child’s disappearance. This was largely thanks to the strange collection of cognitive biases to which the prosecution had succumbed.

So what is “prosecutor’s tunnel vision”, then, and how does it come about?[6] It is a sort of emotional commitment to an (as yet) unproven explanation. We become personally invested in a narrative; the consequences — and personal costs — of abandoning the conviction are great, and grow the more we commit to the position.

Tunnel vision has three phases: first, there must be enabling background conditions that make us vulnerable to tunnel vision; second, there are pathways into a given tunnel; third, there are cognitive biases that keep us there.

The lessons are two-fold:

We are not as rational as we like to think and
Data is never the whole story.

Background conditions

Certain dispositions, biases and miscellaneous psychological tics come together to create the conditions for tunnel vision to take hold of an earnestly held narrative:

Mainly circumstantial evidence

Since tunnel vision is a collection of cognitive biases infecting how we draw inferences, it usually arises where there is no direct evidence of wrongdoing. Where there are reliable eyewitnesses there is little need to infer what happened: someone saw it. Where there are no witnesses — even more so where it is not clear there was a crime at all — the jury must infer what happened from purely circumstantial evidence.

There was no clear evidence Azaria Chamberlain was dead, let alone murdered: she was simply missing. Had a reliable witness seen a dingo carrying her away — even Lindy Chamberlain did not claim to have seen that — there would have been little scope for inference about the significance of the red-brown material spattered under the dashboard of the Chamberlains’ car. (The coroner’s damning report to the crown prosecutor in the Chamberlain case is a chilling example of tunnel vision.)

Information glut

The fact is, there are very few political, social, and especially personal problems that arise because of insufficient information. Nonetheless, as incomprehensible problems mount, as the concept of progress fades, as meaning itself becomes suspect, the Technopolist stands firm in believing that what the world needs is yet more information.[7]

The more circumstantial evidence there is, the more scope for inference, and the more fantastical narratives one can draw. If the alleged crime occurs in a tightly-controlled environment designed to generate technical data and specialist information, and where deep subject matter experts are on hand to observe and analyse that information in retrospect, conditions are ripe for tunnel vision.

This is exactly the scenario in which the healthcare serial murder cases arise. The alleged crimes, though rarely witnessed, take place within a carefully controlled environment[8]. Access is monitored by CCTV and controlled by swipecards and elaborate security systems. Detailed medical protocols govern and track the storage and dispensation of medicines. Sophisticated equipment — machines that go “ping” — monitor patients’ vital signs around the clock. Nurses, orderlies, consultants and doctors constantly mill around at all hours, doing ward rounds, checking in on patients and generally keeping an eye out for signs of trouble.

Though these systems seem incapable of capturing direct evidence of wrongdoing, they still generate a colossal amount of very “scientific” medical and digital data. This is capable of being analysed and framed to support — to be “consistent with” — any number of different and frequently contradictory inferences and theories of the case.

Expert overreach

To a man with a hammer, everything looks like a nail.

—Abraham Maslow

It is hardly news that existing knowledge and beliefs shape what we see and how we see it. Prosecutors (and defenders) hold pet theories about human behaviour just like anyone else.

You don’t sign up for the army if you hope never to shoot a gun. Nor do you join the police hoping never to find crime, nor take a crown warrant if you don’t expect to prosecute. Those involved in prosecution are primed this way, as are we all: to be useful, for their roles to be important, and to make a difference.

This is no less so for expert witnesses. They are incentivised to support the cases on which they are engaged.[9] Yet subject matter experts can overestimate their ability to analyse and judge subtle problems, especially those in fields adjacent to, but not directly within, their expertise. They may over-weight the overall significance of matters that do fall within their expertise against those that do not. They are less likely to consider alternative models, explanations, theories or evidence that de-emphasise their expertise, let alone theories of the case that contradict it.

This kind of expert overreach is germane for the “healthcare serial murder” cases. Human biology is, in the technical sense, complex. There is much about it that even experts do not yet, and may never, know. Patients may have conditions that are never tested for, and therefore never detected, before or after death. They may have conditions as yet unknown to medical science, in which case they would not have been revealed by established tests in any case.

High improbability, whatever the explanation

When the allegation involves extremely improbable events — where there is no commonly experienced explanation — our natural human weakness at statistical reasoning comes into play. Base rate neglect (see below) becomes a risk.

Both possible explanations for Azaria Chamberlain’s disappearance (abduction by a dingo and maternal infanticide) were extremely improbable. What little data there was on dingo abductions suggested they were rare, but the data were not good. Not many people camp with infants in the Outback. Dingoes rarely get the chance to take them. There is a lot of evidence that maternal infanticide is extremely improbable: you, dear reader, are evidence of that.

Now, had dingo attacks been commonly reported before Azaria’s disappearance, the possibility of maternal infanticide might never have come up.

The “healthcare serial murderer” cases typically have this same feature: both the crime (serial murder) and the “innocent alternative” (an unusual cluster of natural deaths coinciding with the attendance of the same nurse) are intrinsically improbable. Base rate neglect — healthcare serial murders remain vanishingly unlikely — is a real risk.

Getting there

Once you are equipped with a hammer and have started wandering around the house looking for nails, there should still be scope to falsify a bad operating theory. But again, psychological biases can override the dispassionate application of cool logic.

Hopeful extrapolation

We are natural riddle-solvers. We build models of the world by habit. It is easy to slip beyond our range of reliable experiences and form theories for which we have little expertise, especially where abundant circumstantial evidence is consistent with — able to be fitted to — our side of the argument, rather than dispositive of it.

Litigation’s adversarial nature, in which advocates are meant to present their arguments in the best possible light, emphasising helpful facts and neglecting unhelpful ones, hardly helps. The underlying philosophy here is akin to Adam Smith’s “invisible hand”: from the interaction of opposed, self-interested advocates we expect the invisible hand of justice miraculously to emerge.

Base rate neglect

Lawyers are not natural statisticians. We should be careful with statistics especially when they concern extremely improbable events.

“Base rate neglect” — the “prosecutor’s fallacy” — is the natural tendency to ignore the “base rate” of an outcome in a population and instead focus on “individuating” information about the specific case.

Linda is a single, middle-aged female philosophy graduate who is active in CND. Asked which description of Linda is more likely to be true, most people choose “she is a bank teller who is active in the feminist movement” over “she is a bank teller”, even though the former is plainly a subset of the latter (strictly, the “conjunction fallacy”, a close cousin of base rate neglect).[10]

Say a medical test is expected to give a correct result 999 times out of 1,000. We are tempted to assume, therefore, that any positive result will be pretty much conclusive. But if the general prevalence of the condition in the population — the “base rate” — is just 1 in 100,000 then for every true positive result, we should expect 100 false ones. The probability that a positive test is accurate is only 1%.[11]
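The arithmetic can be sketched in a few lines of Python. This is a toy illustration of Bayes’ theorem using the figures above; the test and condition are hypothetical.

```python
# Base rate neglect, illustrated: a test that is right 999 times in 1,000,
# applied to a condition afflicting 1 person in 100,000 (both figures are
# the hypothetical ones from the text).

def p_true_given_positive(accuracy: float, base_rate: float) -> float:
    """Bayes' theorem: P(has condition | positive test)."""
    true_positives = accuracy * base_rate
    false_positives = (1 - accuracy) * (1 - base_rate)
    return true_positives / (true_positives + false_positives)

p = p_true_given_positive(accuracy=0.999, base_rate=1 / 100_000)
print(f"{p:.1%}")  # roughly 1%: about 100 false positives per true one
```

The intuition-defying part is that the false positive rate (1 in 1,000) dwarfs the base rate (1 in 100,000), so false positives swamp true ones.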

Take the healthcare serial murderer: say there is a 1 in 342 million chance that a nurse would be on duty for all suspicious deaths in a cluster at random. If one nurse was on duty for all those suspicious deaths, we might suppose it to be damning. The court in the trial of Dutch nurse Lucia de Berk did: it convicted her of serial murder in 2003.

But this is to ignore the base rate. With no other prior information suggesting her guilt (in de Berk’s case, there was none) what is the probability that a given individual is a healthcare serial murderer?

For guilt to be as likely an explanation as random chance — as likely, not more — there would need to be twenty-three healthcare serial killers operating at any given moment. There do not appear to be that many in the world, let alone in healthcare environments.

Had that probability been correct (it turned out to be a wild underestimate), sheer chance was still a likelier explanation than that Lucia de Berk was a serial murderer. (The probability of her shift patterns coinciding by chance was subsequently reassessed as being more like 1 in 25.) De Berk’s conviction was overturned in 2010.
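The “twenty-three serial killers” arithmetic can be sketched as well. The assumptions here are mine, for illustration, not the court’s: that a guilty nurse would certainly have been on duty for the deaths she caused, and that the pool from which a killer might be drawn is a world population of roughly 7.9 billion.

```python
# Illustrative assumptions (mine, not the court's): the prior is "k active
# healthcare serial killers among ~7.9 billion people", and a guilty nurse
# would certainly be on duty for the deaths she caused.

WORLD_POP = 7_900_000_000

def posterior_guilt(p_coincidence: float, active_killers: float) -> float:
    """P(guilty | shift pattern) via Bayes' theorem."""
    prior = active_killers / WORLD_POP
    num = 1.0 * prior                        # P(pattern | guilty) * prior
    den = num + p_coincidence * (1 - prior)  # + P(pattern | innocent) * prior of innocence
    return num / den

# For guilt and chance to be equally likely, the prior must roughly match
# the coincidence probability: about 7.9e9 / 342e6 active killers.
print(round(WORLD_POP / 342_000_000))        # → 23

# With (generously) one active killer and the prosecution's own figure:
print(posterior_guilt(1 / 342_000_000, 1))   # ≈ 0.04: chance is likelier

# With the reassessed 1-in-25 figure:
print(posterior_guilt(1 / 25, 1))            # ≈ 3e-9: overwhelmingly chance
```

Even taking the prosecution’s 1-in-342-million figure at face value, a plausible prior leaves guilt the less likely explanation; with the corrected figure, it all but vanishes.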

Confirmation bias

All observation is theory-dependent: scientists must first have a theory before they can test it: otherwise, how do they know what they are looking for?

Having committed to a bad theory, we tend to seek out, interpret, and best remember information that confirms it. We may over-weight supporting evidence and disregard contradicting evidence. This is what the adversarial system expects.

This is confirmation bias. It has two phases. The first applies in “theory formation”: the man with a hammer is primed to see nails.

Suspicious that strangely behaving Lindy Chamberlain might have something to hide, police searched her possessions, tent and vehicle, eventually finding what they believed to be “fetal blood” splashed under the dashboard of her husband’s car.[12] This set the police in a direction. Before long they had constructed an elaborate theory of how Lindy Chamberlain had committed murder.

The second applies in “theory corroboration”. Here evidence which, taken in the abstract, would tend to be “consistent with” innocence can, with confirmation bias, be presented as damning.

We would expect an innocent mother to behave “oddly” in the aftermath of her infant daughter’s disappearance. We would expect an innocent nurse to lead a healthy social life. We would expect an innocent defendant not to admit to crimes he did not commit, nor express remorse for actions that were not his.

Yet with confirmation bias these very behaviours — things that point away from the defendant’s guilt — are made to appear to point towards it.

Staying there

Once we are in the tunnel, there are cognitive biases that prevent us from finding our way out.

Hindsight bias and the reiteration effect

In hindsight, people tend to think an eventual outcome was inevitable or more likely than they might have before it happened. “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” versus “Given that this woman’s nine-week-old infant has disappeared from the campsite, and the police suspect her of foul play, what is the chance that she brutally murdered her own child?”

Through “hindsight bias” we project our knowledge of the outcome onto our reading of past behaviour, without realising that our perception of the past has been tainted by that knowledge of the outcome.

Once a person becomes a prime suspect, hindsight bias suggests that, upon reflection, she was the likely suspect from the beginning. Evidence is malleable in light of this “realisation”.

There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independently of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more they become invested in it. Their conclusions become entrenched, and it appears obvious that all evidence pointed to the defendant from the outset. Prosecutors and defenders find it increasingly hard to consider alternative theories of the situation.

Randle McMurphy’s dilemma

“Correspondence bias”, or the “fundamental attribution error”, automatically attributes observed behaviour to malice without considering other explanations. Our favourite, per Hanlon’s razor, is stupidity:

“Do not attribute to malice things that can just as well be explained by stupidity.”

But it applies just as well to innocent explanations, and alternative guilty ones. It works like this:

The circular correspondence bias model

A strangely prevalent form of circular correspondence bias works as follows:

There is weak circumstantial evidence that X committed a crime.

The (highly unusual) traits of people who commit that sort of (highly unusual) crime are attributed to X (this is a “fundamental attribution error”).

X’s newly-attributed traits are cited as evidence corroborating the existing weak circumstantial evidence. This is circular.

X is now characterised as a “sociopath”, “narcissist”, “attention-seeker”, or even “Munchausen’s syndrome by proxy sufferer”, so is more likely to have committed the highly unusual crime.

Other aspects of X’s behaviour which, but for the allegation, would be normal, now appear to verify the trait. (He socialises normally. He sends condolences. He works extra shifts.) This, of course, is also circular.

X is convicted on the compelling evidence of the opportunity and his highly unusual and suspicious behaviour.

You might recognise this as the plot driver of Ken Kesey’s One Flew Over the Cuckoo’s Nest.

A worked example: Doctor Bob

So, say there is an unusual cluster of deaths in the geriatric ward of a given hospital. All deaths — eight of them, over six months — coincide with the shift pattern of one doctor, Bob.

This raises a weak but, Bob thinks, easily rebuttable presumption of foul play on Bob’s part. (“Since I didn’t do it, there will be no direct evidence and no strong circumstantial evidence specifically implicating me.”)

Before this statistical correlation, Bob was not under suspicion. Nor was his behaviour unusual: he was a normal young, diligent doctor with an active social life. Nor does his behaviour change after the cluster of deaths.

But the foul play — if that’s what it is — is horrific: someone is systematically murdering little old ladies. Only a psychopath with a stone-cold heart and a narcissistic personality disorder could do that. If it is Bob, he must be a psychopath with a narcissistic personality disorder. This logic is already circular, but the circle is big and conditional enough not to be obvious.

We now start inspecting Bob’s behaviour. We find him to be meticulously tidy. He enjoys socialising, attending a regular salsa class on Wednesdays. He sent condolence cards to the families of several deceased patients.

Now: what could be more indicative of a stone-cold, narcissistic psychopath who has just murdered someone than a fastidiously tidy person — hence, no evidence, right? — who sent a card to the victim’s wife before going out drinking and dancing with his friends? The net is closing in.

A second worked example: Nurse Lucy

How an odd series of coincidences might wind up with a diligent neonatal nurse being handed seven whole-of-life sentences:

There has been an unusually large number of unexplained collapses. (The thing about random events is that they are clumpy, not perfectly smooth and predictable, but let’s park that.)

One explanation for this cluster is that someone is harming babies. (Another is that it could be one of, or a combination of, any number of factors — including a literally infinite set of things we don’t even know about — or indeed that, per step 1, the cluster is simply a truly random variation not prompted by anything in particular. But let’s park those too.)

(Hospital administrators): Hey, there’s a nurse who was on duty for a *lot* of these collapses. Most of them, in fact.

So it is not beyond possibility that someone *is* deliberately harming babies! Whoah. That’s serious. Let us be on the safe side. Let’s call the police!

Please, Mr Policeman, sir, we have had a spike in unexplained deaths in our ICU and we can’t rule out foul play.

(Mr Policeman): But this is serious! What makes you think there is foul play?

Well, without wishing to tell tales there is this nurse who keeps turning up like a bad penny. So there is that. But we don’t know: it may be a coincidence.

(Mr Policeman): Well, let’s have a look at the data! Only a limited number of people have the plausible opportunity to harm babies deliberately — nurses and doctors in the ICU, pretty much — so we should be able to rule them all out pretty quickly, right? Let’s have a look at the shift rota to see who was on duty.

(Mr Policeman): Well, lookee here: there is one, just one nurse, who was on duty for pretty much[13] every single one of those collapses. Ladies and gentlemen, I think we have our answer.

(Hospital administrators, aside): I always knew she was a wrong ’un.
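Step 1’s parenthetical deserves more than a parking space: randomly timed events really do bunch up far more than intuition expects. A toy simulation makes the point; the numbers (52 deaths a year, assigned uniformly at random across 52 weeks) are illustrative assumptions of mine, not drawn from any real case.

```python
# "Clumpiness" of random events: scatter a year's deaths uniformly at
# random across weeks and see how busy the busiest week gets.
import random

random.seed(1)  # make the simulation repeatable

def max_deaths_in_a_week(deaths_per_year: int = 52, weeks: int = 52) -> int:
    """Assign each death to a uniformly random week; return the busiest week's count."""
    counts = [0] * weeks
    for _ in range(deaths_per_year):
        counts[random.randrange(weeks)] += 1
    return max(counts)

# The "smooth" expectation is one death per week. In fact, in well over
# half of simulated years, some week sees four deaths or more:
busy = sum(max_deaths_in_a_week() >= 4 for _ in range(1000))
print(f"{busy / 10:.0f}% of simulated years contain a four-death week")
```

A hospital that treats any four-death week as suspicious will, on these assumptions, find “suspicious” clusters in most perfectly innocent years.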

  1. Christchurch journalist Martin van Beynen’s fantastic podcast Black Hands compellingly makes this case.
  2. They might snap into a sudden orgy of extreme violence — but this plays out as desperate, meltdown mass murder, not calculated, ongoing serial murder, and there is generally no doubt that it is murder and no shortage of direct evidence implicating the accused.
  3. Mental illnesses with a clear medical pathology, that is — not ones suspiciously made up out of ex post facto symptoms, like “Munchausen by proxy”. See the “circular correspondence bias” discussion below.
  4. New Law Journal: The Trouble With “Sure”
  5. In fairness, the crown submitted expert forensic analysis that it was specifically infant blood, so you can hardly fault the jury here. You can fault the crown forensics team, though: it turned out to be acoustic deadening spray and not blood of any kind!
  6. JC draws upon The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott in the Wisconsin Law Review (2006).
  7. Neil Postman, Technopoly: The Surrender of Culture to Technology, 1992.
  8. In the cases the JC has managed to track down, not one involves anyone seeing the accused commit any unequivocal act of harm
  9. There is some debate about expert witness conflicts of interest in criminal law circles but, given how much public interest there is in the question, it has had little public exposure as yet: perhaps the Post Office Horizon IT scandal and the Letby case will change that.
  10. per Kahneman and Tversky. This work is not without its critics. JC wonders whether the same thing propels the commercial solicitor’s compulsion to over-description.
  11. This sounds properly nuts but it is true. Assume you test 100,000 people. At a 99.9% accuracy rate, you will expect 99,900 correct results, and 100 false ones. But in that sample of 100,000 you would only expect 1 actual case of the condition. So if you are tested and receive a positive result, you have a 1 in 100 chance that it is a true result.
  12. Much later the substance was retested. It turned out to be sound-deadening material containing iron oxide.
  13. Rather embarrassingly, it turns out that the shift rota was erroneous and this nurse was not on duty on every single case, but hey ho.