
From The Jolly Contrarian
Revision as of 10:02, 26 July 2024 by Amwelladmin (talk | contribs)

By tunnel vision, we mean that “compendium of common heuristics and logical fallacies,” to which we are all susceptible, that lead actors in the criminal justice system to “focus on a suspect, select and filter the evidence that will ‘build a case’ for conviction, while ignoring or suppressing evidence that points away from guilt.” This process leads investigators, prosecutors, judges, and defence lawyers alike to focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.

The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott (2006)

Do not attribute to malice what can satisfactorily be explained by stupidity.

Hanlon’s razor

To a man with a hammer, everything looks like a nail.

—Abraham Maslow

Prosecutor’s tunnel vision
/ˈprɒsɪkjuːtəz/ /ˈtʌnᵊl/ /ˈvɪʒᵊn/ (n.)
The collection of biases and cognitive gin-traps that can lead prosecutors — those who “prosecute” a particular theory of the world — to stick with it, however starkly it may vary from available evidence and common sense.

So named because it is often literal prosecutors, of crimes, who suffer from it. This kind of tunnel vision has led to notorious miscarriages of justice where innocent people come to be convicted beyond reasonable doubt notwithstanding clear, plausible and more credible alternative explanations.

It is not just miscarriages of justice: the same tunnel vision motivates ideologies, conspiracies and management philosophy. 360-degree performance appraisals, the war on drugs, the worldwide AML military-industrial complex and the batty dogma of outsourcing are all cases where those “prosecuting” the outlook stick with it notwithstanding the weight of evidence that the theory is at best useless, and may lead to the opposite of the desired outcome.

An asymmetry of evidence

At the end of the “prosecutor’s tunnel” are clear but simplistic and often misleading models of a messy world. We are pattern-matching, puzzle-solving animals. We are drawn to neatness, and we resile from intractability because it indicates weakness: that our frail human intellect has been defeated by the ineffable natural order of the cosmos.

So it is that the simplicity of the prosecutor’s case, especially when backed by data, can crowd out our usually strong attachment to common sense, intuition and inference.

We have built our institutions to be vulnerable to this crowding out: we allow “data” as evidence but not “intuition”. Hence an asymmetry: data — evidence — is better at supporting a case for what did happen than for what did not.

Criminal law proceeds upon data and the weight of evidence. Intuition’s role subsists mainly in the presumption of innocence. A prosecutor must prove guilt; the accused need not prove anything: she cannot be expected to explain what happened for the simple reason that, if she did not do it, she has no better idea what happened than anyone else. The jury, we all hope, lean on their intuition when conjuring doubts.

Interlude: difficult cases from down under

Three notorious cases from the antipodes illustrate what is at stake here when intuition and common sense are relegated behind “data”.

A ring of dust around Ayers Rock

Lindy and Michael Chamberlain and their three children were camping at Ayers Rock in central Australia in August 1980.[1] The adults were with other campers around a campfire when Lindy heard a disturbance near the tent where her infant daughter Azaria was sleeping. She went to check on the baby and thought she saw a dingo running out of the tent. When she got to the tent, the child had vanished.

Lindy raised the alarm at once, but Azaria was never found.

Dingo attacks on humans at the time were rare, and the police believed Lindy was behaving strangely. They regarded her “dingo” explanation as absurd.[2] They, and quickly thereafter the public, concluded that Lindy had murdered and disposed of her baby.

They built their case from the little positive evidence they had: Lindy’s absence from the campfire gave her an opportunity; her strange religious beliefs — the Chamberlains were Seventh-Day Adventists — gave her a motive; what appeared to be spattered infant blood in the footwell of the Chamberlains’ car provided forensic evidence; and Lindy’s odd behaviour when interviewed provided corroboration.

Many aspects of the police case were highly implausible: logistically, it was almost impossible for Lindy to have murdered Azaria in the way proposed — with blunt scissors — and disposed of the body and all evidence in the five minutes available to her. Never mind how unlikely it was for a mother — let alone a devout Christian mother — to murder her own infant in cold blood.

Nevertheless, in 1982, Lindy Chamberlain was convicted of Azaria’s murder and spent three and a half years in prison before Azaria’s matinee jacket was found, four kilometres from the campsite, at the entrance to a dingo lair. Lindy was released and pardoned but her conviction was not finally quashed until 1992.

The “blood spatter” in the footwell of the Chamberlains’ Holden Torana turned out, much later, to be a standard sound-deadening compound applied during the car’s manufacture.

Satanic panic in the Garden City

In 1991, Peter Ellis, a childcare worker at a daycare centre in New Zealand, was charged with horrific abuse of several preschool children in his care.[3] Police alleged, on the children’s own evidence, that, among other things, Ellis abducted the children en masse during the day and subjected them to bizarre rituals and acts of unthinkable cruelty and violence.

In total, one hundred and eighteen children were interviewed by police and social workers.

In 1993 Ellis was convicted on 16 counts of child sex abuse against seven children. Investigators discarded evidence from children who did not report abuse, and set aside patently impossible claims, meaning that the evidence put before the court — and disclosed to the defence — appeared more compelling than it would have if viewed in its wider context. It later transpired that the techniques the police and social workers used may well have encouraged the very young children to make their stories up.

But none of the allegations were true.

Ellis maintained his innocence throughout and continued to fight for his name to be cleared, but died in 2019. The New Zealand Supreme Court finally quashed all remaining convictions in 2022, citing a substantial miscarriage of justice due to unbalanced evidence and contamination of the children’s evidence.

Murder in the family

On the morning of 20 June 1994, twenty-two-year-old David Bain returned from his paper round at 6:45am to find his whole family had been shot dead.[4] He did not discover this immediately: it was still midwinter dark at that hour, deep in the Southern Hemisphere. Without switching on a light David first went downstairs to put on a load of laundry. He later told police it was so dark he did not notice his father’s bloodstained clothes on the machine, and inadvertently washed them with his own, obliterating key evidence.

Returning upstairs, David discovered his father Robin lying in the living room beside a .22 rifle with a bullet wound in his head. Quickly thereafter he found the bodies of his mother, two sisters and youngest brother, who appeared to have put up some kind of fight before being overcome.

A note typed on the family computer, apparently by David’s father Robin, said, “You were the only one who deserved to live.” David placed an agitated call to emergency services. It was recorded, and remains a part of the public record.

David told police his father, motivated by a troubled relationship with the family, must have murdered them all before turning the gun upon himself.

Based on circumstantial evidence including bloodstains on his own clothing and spectacles, fingerprints on the murder weapon, minor bruises and abrasions consistent with a struggle with his brother, and a lack of any evidence pointing to his father, David Bain was charged with all five murders and convicted on all counts.

David maintained his innocence.

In 1996 Joe Karam, a former New Zealand rugby international, became involved after reading a newspaper article about university students raising money to fund an appeal. Karam became fascinated by the case, and was persuaded of David’s innocence. He brought significant publicity to David’s cause and championed his innocence, uncovering shortcomings, inconsistencies and oversights in the police investigation and in the handling of evidence.

Eventually, David’s case made it to the Privy Council where, more than a decade after the original trial, the court quashed Bain’s convictions and ordered a retrial. The second jury was not persuaded of David’s guilt and he was acquitted of all charges. He remains a free man.

Narrative biases

These three cases illustrate the problem of “circumstantial evidence” cases: where there is no direct evidence of the alleged crime, one tends to start with a hypothesis and then fit whatever forensic evidence one has into it. There is a kind of tunnel vision at play. This is so regardless of whether you are convinced of the defendant’s guilt or innocence.

All kinds of odd theories circulated owing to the Chamberlains’ unusual religious beliefs — they were Seventh-Day Adventists — and their nationality — they were New Zealanders! But that anyone could bring themselves to do such a thing — never mind the child’s own mother — is hard to imagine. The idea that a child’s own mother would cold-bloodedly murder her nine-week-old baby without warning is, in the abstract, preposterous. And this is before considering the practical difficulties with what was alleged. Even to set this up as a hypothesis there must be solid grounds supporting the highly implausible scenario — if not credible eye-witness evidence, then documented psychiatric instability or a history of volatile, violent temperament — or at least a compelling motivation that could overthrow the powerful human instinct in almost all people to protect vulnerable infants.

There must be no plausible alternative explanation. But Lindy Chamberlain provided one. A dingo snatching the child might have seemed unlikely, but even with no recorded cases, it was not half as unlikely as a mother killing her own infant with a pair of blunt scissors in the footwell of a car a few yards from a group of people sitting around a campfire.

Likewise, for anyone to do what Peter Ellis was alleged to have done was extraordinarily unlikely. For him to have done it without any adult witnessing anything, with no-one noticing the children go missing for, apparently, hours on end, and with none of the children’s supposed injuries leaving any trace at all, made it less likely still. And there was a plausible alternative: the events did not happen at all. All that was required for that to be true was for the children, perhaps at the unwitting prompting of adults already in the grip of prosecutor’s tunnel vision, to make things up. This is not unusual behaviour. By comparison with untraceable ritual satanic abuse, children exercising their imaginations to please adults does not seem unlikely.

It is also true that mass murder, and even more so familicide, is extremely rare and, therefore, in the absence of prior evidence, a highly unlikely explanation for the deaths of the Bain family. But there is no question the Bain family were murdered. The only open question was by whom. On David Bain’s own theory of the case only two people could have done it: his father and himself. It was therefore definitely familicide: the prior improbability of familicide is beside the point. The posterior probability that David was responsible for familicide changes with this information. There is now a 50% chance he was responsible, before considering any further evidence. And the evidence all pointed to David. To not be the murderer, on his own evidence, David would have had to be extremely unlucky — forgetting to turn on the light, accidentally disposing of exculpatory evidence, having incriminating injuries he could not explain — while no such evidence pointed to Robin. David’s defence exhibited its own kind of tunnel vision focusing on the possibility that each piece of evidence might have an innocent explanation. To be clear, there were procedural inadequacies in the police case and David Bain has been acquitted at law, but as a piece of judicial procedure, the comparison between Bain’s case and those of Ellis and Chamberlain is stark.
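The probabilistic reasoning in the paragraph above can be made concrete with a short sketch. The prior and likelihood figures below are illustrative assumptions of ours, not numbers from the case:

```python
# How conditioning dissolves the base-rate objection in the Bain case.
# All numbers are hypothetical, for illustration only.

# Unconditionally, familicide is vanishingly rare...
base_rate_familicide = 1e-7  # assumed tiny prior for any given family

# ...but we KNOW the Bain family was murdered, and on David's own account
# only two people could have done it. Conditioning on that knowledge, the
# base rate drops out and we start from a uniform prior over the two
# candidates: a 50% chance each, before any further evidence.
p_david, p_robin = 0.5, 0.5

# One Bayesian update with a hypothetical likelihood ratio: suppose the
# washed clothes, unexplained injuries and fingerprints were, say, five
# times more likely if David were the killer than if Robin were.
likelihood_ratio = 5.0  # assumed, purely illustrative
posterior_david = (p_david * likelihood_ratio) / (
    p_david * likelihood_ratio + p_robin
)
print(round(posterior_david, 3))  # 0.833
```

The point of the sketch is structural, not numerical: once the hypothesis space is narrowed to two candidates, the prior improbability of familicide does no further work, and the evidence carries all the weight.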

Standards of proof

The prosecution’s standard of proof is, in theory, high: beyond reasonable doubt. It isn’t clear that it quite achieves what it is meant to. Courts have moved to dumb it down: that time-honoured phrase has been discarded and juries are directed to convict only if they are “sure”. While this is meant to mean the same thing, not all are persuaded that is how juries understand it.[5] And there is some reason to think that juries start with a presumption that the accused is guilty at least to the balance of probabilities: assuming the police acted in good faith, why else would the defendant be in the dock?

But a scenario where tendentious data may be introduced in support of guilt but there is a total lack of “data” supporting exoneration — only the intuition that it seems highly unlikely that such a person should do such a thing — may lead to that confusion. Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the evidence of bloodlike spatter in the footwell of her husband’s car. The intuition that a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation was not before the court. Somehow the jury was persuaded not just that she did it, but that there was no plausible alternative explanation.

JC draws upon The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott in the Wisconsin Law Review (2006) and Robert Cialdini’s Persuasion. To some extent also the madness of crowds and Jon Haidt’s The Righteous Mind. The lesson we draw is that we are not as rational as we like to think and data is never the whole story.

Prosecutor’s tunnel vision may describe all view-forming of a “conviction” kind. Such convictions are like political and religious views in that, once they take root, they are not easily displaced.

The “wrongful conviction” cases are bracing because, with the benefit of hindsight, a better narrative and a different cognitive path from the prosecutors’, it is hard to understand how they got there, or why they persisted with such plainly untenable views. If we treat prosecutor’s tunnel vision as a variety of political or even religious conviction, we can see better how “prosecutors” can be so energetic in their maintenance of a bad model. It perhaps explains the gruesome in-house performance in the Post Office Horizon IT scandal.

Prosecutors need not be literal prosecutors: campaigners for innocence, and conspiracy theorists suffer at the hands of the same collection of cognitive traps. Both sides of the public conversation about Lucy Letby are similarly afflicted with tunnel vision: hence, allegations of conspiracy from both sides.

The three phases of tunnel vision

Tunnel vision has three phases: first, the background conditions that make us vulnerable to tunnel vision in the first place; secondly, the factors that push us into a given tunnel; thirdly, the cognitive artefacts that keep us there.

Call these “setting out”, “getting there” and “staying there”.

In order of appearance:

Background

Certain dispositions, biases and miscellaneous psychological tics come together to create the conditions for tunnel vision to swamp an earnestly-held narrative:

The “anchoring” effect

When making decisions we tend to “anchor” our expectations on the first piece of information we get, and then recalibrate as we go, not against some abstract sense of rectitude, but by reference to the anchor. Our initial impression can therefore disproportionately influence the model we draw and our later assessment of responsibility.

This is the theory behind the “discount sticker” in a car showroom: You are already getting a great deal, and you haven’t started haggling![6]

Overconfidence in own expertise

We subject matter experts tend to overestimate our ability to judge subtle problems, especially those adjacent to, but not within, our expertise. We then over-emphasise our experience when forming conclusions — this is an entry condition to any professional calling — even where it might be only a relatively small part of the story.

Where we are overly confident in our essential rectitude we are less likely to consider alternative models, explanations, theories or even evidence.

“To a man with a hammer...”

You don’t enlist in the army hoping never to shoot a gun.

The police show up for work to detect crime. Prosecutors to prosecute it. They are primed this way, as are we all: to be useful and for their role in the great cosmos to be important and to make a difference.

Saying, “well, sorry, but there’s nothing to see here, folks”, packing up and going home is not professionally rewarding. That is not what gets us out of bed.

We tell ourselves archetypal stories: the dogged sleuth who smelled a rat and stuck at it over the objections of naysayers and superiors — threatened with a career issuing parking tickets — and overcame the shadowy machinations of unseen malign forces.

There are no archetypes about conspiracy-obsessive geeks who hound innocent prisoners to their graves.

Role pressure

Law enforcement agencies will be under tremendous pressure to get results. Curiously, this may lead to missed opportunities: notoriously, West Yorkshire police repeatedly missed the Yorkshire Ripper despite interviewing him several times because he didn’t match the profile they had built, which was based on letters from a hoaxer.

Getting there

Once you are safely anchored with your hammer and have started wandering around the house looking for nails, there should still be scope for falsification of your operating theory. But again, psychological biases can override the dispassionate application of cool logic.

Extrapolation

Especially vulnerable are subject matter experts. We are natural problem-solvers and model-builders, and we easily slip beyond our brief and intrude into matters where we have little experience. A statistician can give us a compelling account of the Bayesian probabilities, but when she strays into the causes of elevated insulin readings in a sample containing an apparently significant cluster, she is beyond her expertise. Likewise, a medical expert may opine that the insulin readings are elevated beyond what would usually be expected, but who or what caused the excess insulin is a forensic question, and not something a diabetes specialist has any better idea about than anyone else.

Precedent: bad heuristics

The greater the expertise, the more grooved the expectations, the stronger the heuristic, and the greater our temptation to take that shortcut and presume that this is “one of those” cases. Often the subject matter expert’s model will be right: if so, this heuristic is a handy, efficient shortcut. In the rare case that presents one way but is an exception, it can be dangerous. Heuristics are excellent, efficient devices (as Gerd Gigerenzer notes, they help us to catch balls without needing to perform differential equations), but when the model is wrong they can lead to trouble.

Base rate neglect

“Base rate neglect” — also known as the prosecutor’s fallacy — is our natural tendency to ignore the “base rate” of a phenomenon in the population and instead focus on “individuating” information that pertains to the specific case in front of us.

If, statistically, a certain test has a 1/1000 chance of yielding a “false positive” for a given illness, but the general prevalence of that illness in the population is 1/100,000 then for every one true positive result, we should still expect 100 false positives. Worth remembering if you are diagnosed with a rare illness!
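The arithmetic behind that example can be sketched in a few lines (a minimal illustration of the figures in the text, assuming, generously, a test that catches every real case):

```python
# Base-rate arithmetic for the screening example above:
# false-positive rate of 1/1,000, prevalence of 1/100,000.
population = 100_000
prevalence = 1 / 100_000
false_positive_rate = 1 / 1_000

# Assume the test catches every real case (perfect sensitivity).
true_positives = population * prevalence                               # ≈ 1
false_positives = (population - true_positives) * false_positive_rate  # ≈ 100

# Probability you are actually ill, given a positive result:
p_ill_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_ill_given_positive:.1%}")  # ≈ 1.0%
```

In other words, even a test that is wrong only once in a thousand times leaves a positive result meaning roughly a one per cent chance of illness, because the illness itself is a hundred times rarer than the error.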

The same principle holds for criminal offending: if there is a 1/342,000,000 chance that “so many suspicious deaths could have occurred with the same nurse on duty by sheer chance” it may seem that the fact one nurse was in fact on duty for all those suspicious deaths is damning. But this is to ignore the base rate: How many health professionals are there in the world with no criminal history, motive or mental illness who murder multiple patients?

For the odds to be even with sheer chance — that is the “balance of probabilities”, remember, a long way short of “beyond reasonable doubt” — there would need to be twenty-three. There are not twenty-three such serial killers in the world.

Yet “one in three-hundred and forty-two million” was the figure that convicted Dutch nurse Lucia de Berk of serial murder. Even if this figure had been correct, sheer chance was still the far likelier explanation.

It turned out the statistics were in any case wrong: the probability of her shift patterns coinciding by chance was reassessed as being more like one in twenty-five. For the odds of serial murder to be even with that, there would need to be 320 million hospital serial killers.
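The text’s figures can be reproduced in a short sketch. The reference class — taken here as a world population of roughly 7.9 billion — is our assumption, chosen because it yields the article’s numbers:

```python
# For "serial killer" to be as likely a priori as "sheer chance", the
# prevalence of such killers in the reference class must equal the
# quoted probability of a chance coincidence.
WORLD_POPULATION = 7.9e9  # assumed reference class

def killers_needed_for_even_odds(p_chance: float) -> float:
    return WORLD_POPULATION * p_chance

# The figure used at trial:
print(round(killers_needed_for_even_odds(1 / 342_000_000)))  # ≈ 23

# The reassessed figure:
print(round(killers_needed_for_even_odds(1 / 25) / 1e6))  # ≈ 316 (million)
```

On these assumptions the reassessed figure comes out at about 316 million, which the article rounds to 320 million.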

Observers might note the similarities between this case and a British case that is currently in the news.

Confirmation bias

All observation is theory-dependent: scientists must first have a theory before they can gather evidence to test it: otherwise, how do they know what evidence to look for?

Having picked a bad heuristic, we tend to seek out, interpret, and best remember information that confirms it. We may overweight evidence that supports our theory and disregard or minimise anything that contradicts it.

Convinced that Lindy Chamberlain had murdered her infant daughter Azaria, police searched high and low for blood, eventually finding some spattered over the footwell of her husband’s car. This crucial evidence in her conviction turned out to be sound-deadening material containing iron oxide that was present on all cars of that model.

Selective information processing

Focusing on certain pieces of evidence while ignoring others. Prosecutors might only present evidence that strengthens their case and neglect exculpatory evidence that could help the defence. Peter Ellis’ prosecutors interviewed twenty or more children. Some gave plainly preposterous accounts of what went on. They were not called to give evidence and their statements were considered irrelevant and therefore were not all made available to the defence.

Groupthink

Thinking or making decisions as a group in a way that discourages creativity or individual responsibility: prosecutors might conform to the prevailing opinion within their office, stifling dissenting views and critical analysis of the case. See also Dan Davies’ idea of “accountability sinks”. The Post Office Horizon IT scandal is perhaps the archetypal example of groupthink.

Reductionism

Drilling deep into technical details that, by themselves and shorn of all context, seem to lead to one conclusion — especially one to which you are already anchored — notwithstanding a wider picture that makes the hypothesis unlikely. The risk is greatest in cases with no direct evidence.

Prosecutors’ focus on “blood” sprayed up in the footwell of the Chamberlains’ car led them to a theory that Azaria was murdered there, despite no evidence supporting the theory and quite a lot against it — principally, the lack of time for Lindy Chamberlain to do any such thing. The prosecution case started with “murder in the car” as the anchoring evidence and hypothesised a whole story around it, for which there was no supporting evidence but also no contradicting evidence, so it was “possible”. There is a lot of this in the Lucy Letby case, on both sides.

Staying there

Hindsight bias and the reiteration effect

In hindsight, people tend to think an eventual outcome was inevitable, or more likely or predictable, than they might have before it happened. “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” versus “Given that this nine-week-old child has disappeared from the campsite, and the police suspect the mother of foul play, what is the prospect that the mother brutally murdered the child?”

Through “hindsight bias” we project new knowledge (of actual outcomes) onto our knowledge of the past (observed behaviour), without realising that the perception of the past has been tainted by the subsequent information.

Once a person becomes the prime suspect and prosecutors settle on their own determination of who they believe is guilty, hindsight bias suggests that, upon reflection, the suspect was the inevitable and likely suspect from the beginning. Evidence is malleable in light of this “realisation”.

This is compounded by a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its truth or falsity. The longer that police, prosecutors and witnesses live with a conclusion of guilt, the more entrenched their conclusion becomes, and the more obvious it appears that all evidence pointed to that conclusion from the very beginning. This “reiteration effect” makes it increasingly difficult for police and prosecutors to consider alternative perpetrators or theories of a crime.

Outcome bias

Like hindsight bias, “outcome bias” involves projecting subsequent “outcomes” onto observed behaviour, only this time onto the quality of a person’s decisions. Subjects are more likely to judge as bad a surgeon’s decision to operate when they are told the patient died during surgery than when told the patient survived. This is the operator error presumption from Sidney Dekker’s The Field Guide to Human Error Investigations.

Sunk cost fallacy

The inclination to continue an endeavour once money, effort, time or credibility has been invested, even when new evidence suggests the defendant might be innocent. (see also commitment when talking about persuasion)

Antidotes

Q: How many psychiatrists does it take to change a light bulb?
A: Just one; but the light bulb really has to want to change.

There are some strategies to counteract the effect, but the predominant one is to want to keep an open mind.

Hanlon’s and Otto’s razors

“Do not attribute to malice things that can just as well be explained by stupidity.”

Hanlon’s razor

Don’t assume malice where stupidity will do; likewise, per Otto’s razor, don’t attribute to virtue something that could equally be attributed to self-interest; or to skill something that could equally be attributed to dumb luck.

  1. A Perfect Storm: The True Story of the Chamberlains is a fabulous account of the whole affair.
  2. A common schoolyard joke at the time: Q: What is the ring of dust rising around Ayers Rock? A: The dingoes doing a lap of honour.
  3. Conviction: The Christchurch Civic Creche Case
  4. Black Hands: A Family Mass Murder
  5. New Law Journal: The Trouble With “Sure”
  6. Anchoring is relatively well documented. Kahneman and Tversky asked subjects to spin a wheel of fortune and, after spinning, to estimate the percentage of African nations in the UN. Those who landed on a higher number gave significantly higher estimates than those who landed on a lower number.