
From The Jolly Contrarian

By tunnel vision, we mean that “compendium of common heuristics and logical fallacies,” to which we are all susceptible, that lead actors in the criminal justice system to “focus on a suspect, select and filter the evidence that will ‘build a case’ for conviction, while ignoring or suppressing evidence that points away from guilt.” This process leads investigators, prosecutors, judges, and defence lawyers alike to focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.

The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott (2006)

Prosecutor’s tunnel vision
/ˈprɒsɪkjuːtəz/ /ˈtʌnᵊl/ /ˈvɪʒᵊn/ (n.)
The collection of biases and cognitive gin-traps that can lead prosecutors — those who “prosecute” a particular theory of the world — to stick with it, however starkly it may vary from available evidence and common sense.

So named because it is often literal prosecutors, of crimes, who suffer from it. This kind of tunnel vision has led to notorious miscarriages of justice where innocent people come to be convicted notwithstanding clear and plausible alternative explanations for their ostensible “crimes”.

The same tunnel vision also motivates ideologies, conspiracies and management philosophies. 360-degree performance appraisals, outsourcing, the war on drugs and the worldwide AML military-industrial complex are all cases where those “prosecuting” the theory stick with it even though the weight of evidence suggests it does not work and may even be counterproductive.

The “prosecutor’s tunnel” begins with clear but simplistic — misleading — models of a messy world. Humans have a weakness for these: we are pattern-matching, puzzle-solving animals. We are drawn to neatness. We resile from intractability as it indicates weakness: that our frail human intellect has been defeated by the ineffable natural order of things.

An elegant hypothesis

Sometimes the sheer elegance of a prosecutor’s case can crowd out common sense and the basic intuition that this cannot be right.

We have built our legal institutions to be vulnerable to this kind of crowding out. Criminal law proceeds upon data and the weight of evidence but disallows “intuition”. And there is an asymmetry: evidence is better at saying what did happen than what did not. This is especially so where there is no direct evidence that the defendant actually did what she is accused of.

Circumstantial evidence does not directly implicate a defendant but is consistent with the prosecution theory. It accumulates: if there is enough of it, and none points away from the defendant, it can tell us something. But correlation is not causation: evidence that is “consistent with” a prosecution theory does not prove it. That JC owns a bicycle is consistent with his competing in the Tour de France; it does not make it any more likely that he does. Evidence can look more meaningful than it is. This is where intuition ought to be able to help us.

As it is, intuition’s role is relegated to underpinning the presumption of innocence. A prosecutor must prove guilt; the accused need not prove anything: she cannot be expected to explain what happened, for the simple reason that an innocent person should have no better idea about it than anyone else. The jury, we hope, leans on its intuition when conjuring doubts.

Experience tells us otherwise. In what follows, JC takes three notorious cases from the antipodes to see what can happen when, with no direct evidence, those arguing the case become afflicted with tunnel vision, and intuition and common sense are relegated behind “data” and circumstantial evidence. Then we will look at what causes this condition.


Narrative biases

These cases illustrate the problem of relying on circumstantial evidence: with no independent direct evidence, one tends to start with a hypothesis and fit whatever secondary and forensic evidence one has into it, discarding whatever does not fit. This is the classic tunnel vision scenario. It can afflict those who would defend suspects just as firmly as those who prosecute them.

All kinds of theories circulated owing to the Chamberlains’ unusual religious beliefs and “odd behaviour” in the aftermath of Azaria’s disappearance. But devout Christianity is hardly a solid prior indicating a tendency to murder. Nor is “odd behaviour” in the aftermath of a mother’s most extreme psychological trauma. Who would not behave oddly in those circumstances?

That anyone could bring themselves to cold-bloodedly murder a nine-week-old baby is hard to imagine. Statistically, it is highly improbable. That the child’s own mother would is, in the absence of compelling evidence, preposterous. To even start with this theory you must surely have compelling grounds to believe it over all other possibilities — if not credible eye-witness evidence, then a documented history of violence, behavioural volatility or psychiatric illness grave enough to overthrow the strong human instinct to protect vulnerable infants. Lindy Chamberlain had no such history.

If there is any plausible alternative explanation for the baby’s disappearance, there must be a reasonable doubt. It need not be more probable than the prosecution case: just not out of the question. Lindy Chamberlain provided one: a dingo snatching the child might have been unprecedented, but it was possible. There were dingoes in the area. They are predators. They are strong enough to carry away a human infant. A dingo was no less likely than a new mother noiselessly murdering her own infant just yards from a group of independent witnesses. That ought to have been the end of it.

Likewise, what Peter Ellis was alleged to have done is extraordinarily improbable. There are few documented cases of ritualistic abuse on that scale anywhere in the world. There are none in New Zealand. For such a thing to have happened without any prior evidence of such behaviour, with no adult witnesses, no one noticing the absent children and for none of the children to bear any trace of their supposed injuries makes it even less likely.

And there was a plausible alternative: nothing happened at all. All that was required for that to be true was for preschool children, perhaps at the prompting of interviewers already in the grip of prosecutor’s tunnel vision, to make things up. By comparison with “untraceable, unwitnessed, wide-scale ritual satanic abuse”, “children exercising their imaginations to please adults” is not improbable.

It is different for David Bain. While it is true that familicide is extremely rare and, therefore, absent prior evidence, highly improbable, there is no question that the Bain family were murdered. The only question was by whom.

On David’s own theory, only two people could have done it: his father and himself. It was, therefore, definitely familicide: the abstract improbability of that explanation is beside the point. The probability that David was responsible is correspondingly much higher: before considering any further evidence, there is a 50% chance he was responsible.

And a lot of the further evidence pointed in his direction. To not be the murderer, on his own evidence, David would have been extremely unlucky — forgetting to turn on the light, inadvertently disposing of exculpatory evidence, having incriminating injuries he could not explain — while no such evidence pointed to Robin. David’s defenders had their own tunnel vision, focusing narrowly on the provenance of each piece of incriminating evidence, identifying formal shortcomings in its value as evidence: questioning the manner of its collection, the chain of custody, raising possibilities of innocent explanations without evidence to support that alternative, and disregarding the wider context of the whole case.

Now, David Bain was acquitted of all charges. On the evidence, the jury could not rule out the possibility that Robin Bain was responsible. Not being satisfied beyond reasonable doubt that David was the perpetrator, he was correctly acquitted at law. But it remains likely that David was the perpetrator.[1] As a piece of judicial procedure, the comparison between Bain’s case and those of Ellis and Chamberlain is stark.

Tunnel vision and circumstantial evidence

Where there is reliable direct evidence — eyewitnesses, recordings, and causative links between a suspect and the allegation — there is little need for inference; the evidence speaks for itself. But cases composed predominantly of circumstantial evidence — that therefore depend on inferential reasoning — are vulnerable to tunnel vision, because the complex of cognitive biases that makes up prosecutor’s tunnel vision affects the process of inference.

Upstanding citizen turns master criminal. Does well.

Prosecutor’s tunnel vision cases often involve hitherto law-abiding citizens suddenly committing fiendish crimes without warning, explanation or motive.

Now JC is, ahem, told that committing violent crime without leaving any incriminating evidence is extremely hard. Especially in a controlled environment like an infants’ daycare centre or a hospital.

To be sure, serial criminals can operate in these environments but they will need to be good: meticulous in their preparation and method. Over time, they will hone their techniques and perfect a modus operandi, acquiring a ghoulish sort of expertise in murder: killing patients in a closely monitored, controlled environment populated by trained experts hardly lends itself to opportunistic, freestyle offending. Hospitals, in particular, overflow with specialists who can detect subtle clues that ordinary laypeople — and burgeoning criminals learning their craft — have no idea about.

As with any complicated discipline, one learns as one goes. We should not, therefore, expect “beginners” to perform like master jewel thieves, slipping in and out, striking in the dark and leaving no trace. They will blunder. They will be careless. They will leave evidence. They will slip up, leave giveaways and clumsily trigger red flags. From new criminals, we should expect “smoking guns”.

So if a strange confluence of events is accompanied by no smoking gun, this too has some prior probability value. It does not exclude the possibility of foul play, but it does make it less likely.

People do not often flip, overnight and without warning, from conscientious citizens to compulsive criminals. If they did, we would notice it.[2] When hitherto law-abiding people do slide into criminality, there is generally motivation, a history of antisocial behaviour, identifiable psychological trauma, drug dependency, observed personality change over time or diagnosed mental illness.[3] Often all of these things. (Let us call them “criminal propensities”.)

The absence of any of these criminal propensities in a suspect’s makeup should reduce the “prior probability” of foul play by that suspect. As we will see, “circular correspondence bias” can take such a lack of criminal propensity and somehow invert it into confirmation.

Where a crime has certainly been committed, this goes only to who the perpetrator is. There may (as in David Bain’s case) be only a small universe of credible suspects. If all “possible suspects” have the same lack of criminal propensity, it will count for little. But if the universe of “potential suspects” is large — or if it is plausible that no crime was committed at all — an individual’s lack of any criminal propensity should tell us something “circumstantial”.

Neither Lindy Chamberlain nor Peter Ellis had any criminal propensity, and in both cases there was a plausible alternative explanation. For David Bain it was different.

Burden and standard of proof

The burden of proof is a different thing to the standard of proof. The burden is who has to prove their case: this falls squarely on the prosecution. The defence is not required to prove anything, least of all the accused’s innocence.

But there is tension between that crystalline legal theory and the practical reality: it is in the defendant’s interest that someone casts doubt into jurors’ minds. Since the Crown plainly won’t be doing that, the defence must either rely on jurors to confect plausible doubts by themselves, or it must plant some doubts there. It is a brave defence counsel indeed who puts her client’s future in the hands of a jury’s imagination and capacity for creative thought.

All the same, the prosecution’s standard of proof — what it must do to discharge its burden of proof — is, in theory, extremely high. Courts have dumbed down the time-honoured phrase “beyond reasonable doubt”: these days, juries are directed to convict only if they are “sure”. This is meant to mean the same thing, but not everyone is persuaded that is how juries understand it.[4]

There is some reason to think that juries start with an ad hoc presumption that any defendant put before them is somewhat likely to be guilty: if the police were competent and acted in good faith, why else would the defendant be in the dock?

So where there is only tendentious data supporting a defendant’s guilt but a total lack of “data” supporting her innocence — what evidence could there be that you did not do something that did not happen? — there are grounds for confusion here, and there is good evidence that juries do indeed get confused.

Lindy Chamberlain was convicted of her own daughter’s murder, with a pair of blunt scissors, on the circumstantial evidence of what looked like blood sprays in the footwell of the family car.[5]

Evidence supporting the intuition that “a sane mother is most unlikely to brutally murder her own nine-week-old child at all, let alone with an improvised weapon and without warning or provocation” was not before the court. What evidence could there be of that? Somehow the jury was persuaded not just that she did murder her child, but that there was no plausible alternative explanation for the child’s disappearance. This was largely thanks to the strange collection of cognitive biases to which the prosecution had succumbed.

The three phases of tunnel vision

So what is “prosecutor’s tunnel vision”, then, and how does it come about?[6] It is a sort of “emotional conviction” in an (as-yet) unproven explanation. We become personally invested in a narrative; the consequences — and personal costs — of rejecting the conviction are great, and grow the more we commit to the position.

Tunnel vision has three phases: first, there must be enabling background conditions that make us vulnerable to tunnel vision; second, there are pathways into a given tunnel; third, there are cognitive biases that keep us there.

The lessons are two-fold:

We are not as rational as we like to think and
Data is never the whole story.

Background conditions

Certain dispositions, biases and miscellaneous psychological tics come together to create the conditions for tunnel vision to take hold of an earnestly-held narrative:

Mainly circumstantial evidence

In that tunnel vision is a collection of cognitive biases infecting how we draw inferences, it usually arises where there is no direct evidence of wrongdoing. Where there are reliable eyewitnesses there is little need to infer what happened: someone saw it. Where there are no witnesses and the case depends on circumstantial evidence — even more so where it isn’t clear there was a crime at all — the jury must infer what happened from purely circumstantial evidence.

There was no clear evidence Azaria Chamberlain was dead, let alone murdered: she was simply missing. Had a reliable witness seen a dingo carrying her away — even Lindy Chamberlain did not claim to have seen that — there would be little scope for inference about the significance of the red-brown material spattered under the dashboard of the Chamberlains’ car. (The coroner’s damning report to the crown prosecutor in the Chamberlain case is a chilling example of tunnel vision.)

Information glut

The fact is, there are very few political, social, and especially personal problems that arise because of insufficient information. Nonetheless, as incomprehensible problems mount, as the concept of progress fades, as meaning itself becomes suspect, the Technopolist stands firm in believing that what the world needs is yet more information.[7]

The more circumstantial evidence there is, the more scope for inference, and the more fantastical narratives one can draw. If the alleged crime occurs in a tightly-controlled environment designed to generate technical data and specialist information, and where deep subject matter experts are on hand to observe and analyse that information in retrospect, conditions are ripe for tunnel vision.

This is exactly the scenario in which the healthcare serial murder cases arise. The alleged crimes, though rarely witnessed, take place within a carefully controlled environment[8]. Access is monitored by CCTV and controlled by swipecards and elaborate security systems. Detailed medical protocols govern and track the storage and dispensation of medicines. Sophisticated equipment — machines that go “ping” — monitor patients’ vital signs around the clock. Nurses, orderlies, consultants and doctors constantly mill around at all hours, doing ward rounds, checking in on patients and generally keeping an eye out for signs of trouble.

Though these systems seem incapable of capturing direct evidence of wrongdoing, they still generate a colossal amount of very “scientific” medical and digital data. This is capable of being analysed and framed to support — to be “consistent with” — any number of different and frequently contradictory inferences and theories of the case.

Expert overreach

To a man with a hammer, everything looks like a nail.

—Abraham Maslow

It is hardly news that existing knowledge and beliefs shape what we see and how we see it. Prosecutors (and defenders) hold pet theories about human behaviour just like anyone else.

You don’t sign up for the army if you hope not to shoot a gun. Nor do you join the police hoping to never find crime, nor take a crown warrant if you don’t expect to prosecute. Those involved in prosecution are primed this way, as are we all: to be useful; for their roles to be important and to make a difference.

This is no less so for expert witnesses. They are incentivised to support the cases on which they are engaged.[9] Yet subject matter experts can overestimate their ability to analyse and judge subtle problems, especially those in fields adjacent to, but not directly within, their expertise. They may over-weight the overall significance of matters that do fall within their expertise against those that do not. They are less likely to consider alternative models, explanations, theories or evidence that de-emphasise their expertise, let alone theories of the case that contradict it.

This kind of expert overreach is germane for the “healthcare serial murder” cases. Human biology is, in the technical sense, complex. There is much about it that even experts do not yet, and may never, know. Patients may have conditions that are never tested for, and therefore never detected, before or after death. They may have conditions as yet unknown to medical science, in which case they would not have been revealed by established tests in any case.

Low base rate

Furthermore, when we are dealing with extremely rare events, any statistical analysis is vulnerable also to base rate neglect. “Healthcare serial murderers” (whether real or imagined) are extremely rare. So, Q.E.D., are the “innocent” causes with which they may be confused: if the “plausible innocent cause of death” explanation were not rare, there would be little chance of its being confused with serial murder; the likelihood of a malicious explanation is therefore much lower. As to which, see below.

Getting there

Once you are anchored, equipped with a hammer and have started wandering around the house looking for nails, there should still be scope to falsify your operating theory. But again, psychological biases can override the dispassionate application of cool logic.

Extrapolation and bad heuristics

Subject matter experts are prone to extrapolate. Humans are natural problem-solvers. We build models of the world by habit. It is easy to slip beyond our range of reliable experience and form theories about matters where we have little expertise. A statistician can give a compelling account of the Bayesian probabilities, but who or what caused the excess toxins is a question of forensics, not statistics.

The greater the expertise, the more “grooved” the expectations, the stronger the heuristic, and the greater the temptation to take shortcuts: to presume this is “one of those” cases, relying on whatever information is most readily available or recent in the mind rather than conducting a thorough investigation. This can lead to overestimating the importance of certain pieces of evidence.

Base rate neglect

Lawyers are not natural statisticians. “Base rate neglect” — the “prosecutor’s fallacy” — is the natural tendency to ignore the “base rate” of a phenomenon in the population and instead focus on “individuating” information that pertains to the specific case in front of us.

We should be careful with statistics about extreme events.

Say a certain medical test gives a correct result 999 times out of 1,000 and only yields one “false positive” result in a thousand tests. It is tempting to assume that a positive result from such a reliable test will be conclusive. But if the general prevalence in the population of the condition it is testing for — the “base rate” for the illness — is only one in 100,000, then for every single true positive result, we should still expect 100 false positives. Fully 99% of positive tests will be false.[10]
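As a quick sketch, using the hypothetical figures from the example above (and simplifying by assuming the one true case always tests positive):

```python
# Base rate neglect: a test that is right 999 times in 1,000, applied to
# a condition afflicting only 1 person in 100,000.
population = 100_000
accuracy = 0.999
prevalence = 1 / 100_000

true_cases = population * prevalence            # 1 actual sufferer
false_positives = population * (1 - accuracy)   # 100 healthy people wrongly flagged
# Simplifying assumption: the single true case also tests positive.
p_true_given_positive = true_cases / (true_cases + false_positives)

print(f"Positive tests that are genuine: {p_true_given_positive:.1%}")  # ≈ 1.0%
```

The reliability of the test matters far less than the rarity of what it tests for.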

The same principle holds for criminal offending: if there is, say, a 1/342,000,000 chance that “so many suspicious deaths could have occurred with the same nurse on duty by sheer chance” and one nurse was in fact on duty for all those suspicious deaths, it may seem utterly damning. But this is to ignore the base rate: how many healthcare serial killers are there likely to be in the world?

On a planet with eight billion people, an event with a probability of 1/342,000,000 should still occur, by sheer chance, about twenty-three times. For guilt to be the likelier explanation, then, there would need to be at least twenty-three hospital serial killers at large. There do not appear to be that many in the last generation.

Yet “one in three-hundred and forty-two million” was the figure that convicted Dutch nurse Lucia de Berk of serial murder. Had this estimate been correct (it turned out to be a wild underestimate), sheer chance was still a likelier explanation.[11]
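The back-of-the-envelope arithmetic can be sketched as follows (the eight-billion figure is the rough world population used above; this illustrates the base-rate point, not a re-analysis of the case):

```python
# Base rate check on a "1 in 342,000,000" coincidence figure: even a
# vanishingly unlikely shift-pattern coincidence should be expected to
# occur somewhere, given enough people it could happen to.
p_coincidence = 1 / 342_000_000     # claimed chance of the shifts lining up innocently
world_population = 8_000_000_000    # rough figure, as in the text

# Expected number of such innocent coincidences worldwide:
expected_coincidences = world_population * p_coincidence

# For guilt to be the likelier explanation, there would need to be at
# least this many actual hospital serial killers at large.
print(f"Expected innocent coincidences: {expected_coincidences:.1f}")  # ≈ 23.4
```

The "damning" number only damns if serial killers are more common than the coincidence it rules out.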

Confirmation bias

All observation is theory-dependent: scientists must first have a theory before they can gather evidence to test it; otherwise, how do they know what evidence to look for?

Having picked a bad heuristic, we tend to seek out, interpret, and best remember information that confirms it. We may overweight evidence that supports our theory and disregard or minimise anything that contradicts it.

There are two phases of confirmation bias: the first is theory formation. Thanks to subject matter expertise extrapolation, like the man with a hammer you are primed to see nails. Suspicious that strangely-behaving Lindy Chamberlain might have something to hide, police searched the family’s possessions, tent and vehicle, eventually finding something forensic tests suggested was consistent with “fetal blood” splattered up inside the footwell of her husband’s car. (Much later the substance was retested. It turned out to be sound-deadening material containing iron oxide.) This set the police in a direction, and before long they had constructed an elaborate theory of how Lindy Chamberlain had committed murder.

The second phase is evidence interpretation. Evidence that, taken in the abstract, would tend to be “consistent with” innocence can, with confirmation bias, be presented as damning. The innocent defendant who maintains innocence notwithstanding his wrongful conviction is presented as callous and lacking remorse, and may expect a harsher sentence and less chance of parole. What is a wrongful convict meant to do?

An innocent defendant who, during a period of misfortune for which she was not responsible, went out with her friends, posting pictures of herself as a vivacious young woman who enjoyed socialising, is thereby portrayed as a callous, calculating demon. But these are exactly the behaviours to be expected of an innocent person. This ought to point away from the defendant, though with confirmation bias it points towards her.

Selective information processing

Focusing on certain pieces of evidence while ignoring others. Prosecutors might only present evidence that strengthens their case and neglect exculpatory evidence that could help the defence. Peter Ellis’ prosecutors interviewed one hundred and eighteen children. Many denied all suggestion of abuse. Some gave plainly preposterous accounts of what went on. They were not called to give evidence. Their statements do not appear to have been fully disclosed to the defence.

Groupthink

Thinking or making decisions as a group in a way that discourages creativity or individual responsibility: Prosecutors might conform to the prevailing opinion within their office, stifling dissenting views and critical analysis of the case. See also Dan Davies’ idea of the “accountability sinks” within an organisation. The Post Office Horizon IT scandal is perhaps the archetypal example of groupthink amongst a group of prosecuting lawyers.

Reductionism

Immersing oneself in technical details that, by themselves and shorn of all context, seem to lead to one conclusion — especially one to which you are already anchored — notwithstanding the wider picture making the hypothesis unlikely. Especially in cases with no direct evidence, there is a great risk of this.

Staying there

Once we are in a tunnel, there are cognitive biases that prevent us from finding our way out.

Hindsight bias and the reiteration effect

In hindsight, people tend to think an eventual outcome was inevitable or more likely than they might have before it happened. “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” versus “Given that this woman’s nine-week-old infant has disappeared from the campsite, and the police suspect her of foul play, what is the chance that she brutally murdered her own child?”

Through “hindsight bias” we project knowledge of actual outcomes onto our knowledge of the observed behaviour in the past, without realising that the perception of the past has been tainted by the subsequent information.

Once a person becomes a prime suspect, hindsight bias suggests that, upon reflection, the suspect was the inevitable and likely suspect from the beginning. Evidence is malleable in light of this “realisation”.

There is also a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its validity. The longer that police, prosecutors and witnesses live with the conclusion of guilt, the more invested in it they are, the more entrenched their conclusion becomes, and the more obvious it appears that all evidence pointed that way from the very beginning. It is increasingly difficult for police and prosecutors to consider alternative perpetrators or theories of a crime.

Circular correspondence bias: Randle McMurphy’s dilemma

Correspondence bias, or the “fundamental attribution error”, attributes observed behaviour to personality, or malice, without considering more probable situational factors. One can avoid this, though few enough do, with Hanlon’s razor:

“Do not attribute to malice things that can just as well be explained by stupidity.”

A strangely prevalent form of reasoning combines it with circular logic, as follows:

  1. There is weak circumstantial evidence that X committed a crime.
  2. The (highly unusual) traits of people who commit that sort of (highly unusual) crime are attributed to X (this is a “fundamental attribution error”).
  3. X’s newly-attributed traits are cited as evidence corroborating the existing weak circumstantial evidence (this is a circularity). X is now a “sociopath”, “narcissist”, “attention-seeker”, or even “Munchausen’s syndrome by proxy sufferer”, so is more likely to have committed the highly unusual crime.
  4. Other aspects of X’s behaviour which, but for the allegation, would be normal, now appear to verify the trait (this is a further circularity).
  5. X is convicted on the compelling evidence of the opportunity and possession of that highly unusual trait.

You might recognise this as the plot driver of Ken Kesey’s One Flew Over the Cuckoo’s Nest.

Bob the doctor

So, say there is an unusual cluster of deaths in the geriatric ward of a hospital. It coincides with the shift pattern of one doctor, Bob, raising a weak and easily rebuttable presumption of foul play on Bob’s part.

Before this statistical correlation, Bob was not under suspicion. Nor was his behaviour unusual.

But the foul play — if that’s what it is — is horrific: someone is systematically murdering little old ladies. Only a psychopath with a stone-cold heart and a narcissistic personality disorder could do that. If it is Bob, he must be a psychopath with a narcissistic personality disorder. This logic is already circular, but the circle is big and conditional enough not to be obvious.

We now start inspecting Bob’s behaviour. We find him to be meticulously tidy. He enjoys socialising, attending a regular salsa class on Wednesdays. He sent condolence cards to the families of several deceased patients.

Now: what could be more indicative of a stone-cold, narcissistic psychopath who has just murdered someone than a fastidiously tidy person — hence, no evidence, right? — who sent a card to the victim’s wife before going out drinking and dancing with his friends? The net is closing in.

Antidotes

Q: How many psychiatrists does it take to change a light bulb?
A: Just one; but the light bulb really has to want to change.

There are some strategies to counteract the effect, but the predominant one is to want to keep an open mind.

Hanlon’s and Otto’s razors

“Do not attribute to malice things that can just as well be explained by stupidity.”

Hanlon’s razor

Don’t assume malice where stupidity will do; likewise, per Otto’s razor, don’t attribute to virtue something that could equally be attributed to self-interest; or to skill something that could equally be attributed to dumb luck.

  1. Christchurch journalist Martin Van Beynen’s fantastic podcast Black Hands compellingly makes this case.
  2. They might snap into a sudden orgy of extreme violence — but this plays out as desperate, meltdown mass murder, not calculated ongoing serial murder, and there is generally no doubt that it is murder and no shortage of direct evidence implicating the accused.
  3. Mental illnesses having a clear medical pathology, that is, not ones suspiciously made up out of ex post facto symptoms like “Munchausen by proxy”. See the “circular correspondence bias” discussion below.
  4. New Law Journal: The Trouble With “Sure”
  5. In fairness, the crown submitted expert forensic analysis that it was specifically infant blood, so you can hardly fault the jury here. You can fault the crown forensics team, though: it turned out to be acoustic deadening spray and not blood of any kind!
  6. JC draws upon The Multiple Dimensions of Tunnel Vision in Criminal Cases by Keith Findley and Michael Scott in the Wisconsin Law Review (2006).
  7. Neil Postman, Technopoly: The Surrender of Culture to Technology, 1992.
  8. In the cases the JC has managed to track down, not one involves anyone seeing the accused commit any unequivocal act of harm
  9. There is some debate about expert witness conflicts of interest in criminal law circles but, given how much public interest there is in the question, it has had little public exposure as yet: perhaps the Post Office Horizon IT scandal and the Letby case will change that.
  10. Assume you test 100,000 people. At a 99.9% accuracy rate, you will expect 99,900 correct results, and 100 false ones. But in that sample of 100,000 you would only expect 1 actual case of the condition. So if you are tested and receive a positive result, you have a 1 in 100 chance that it is a true result.
  11. It turned out the statistics were in any case wrong: the probability of her shift patterns coinciding by chance was reassessed as being more like one in twenty-five. De Berk’s acquittal was overturned in 2010.