Template:M intro crime tunnel vision
{{quote|By tunnel vision, we mean that “compendium of common heuristics and logical fallacies,” to which we are all susceptible, that lead actors in the criminal justice system to “focus on a suspect, select and filter the evidence that will ‘build a case’ for conviction, while ignoring or suppressing evidence that points away from guilt.” This process leads investigators, prosecutors, judges, and defence lawyers alike to focus on a particular conclusion and then filter all evidence in a case through the lens provided by that conclusion.
: — ''The Multiple Dimensions of Tunnel Vision in Criminal Cases'' by Keith Findley and Michael Scott (2006)}}
{{d|Prosecutor’s tunnel vision|/ˈprɒsɪkjuːtəz/ /ˈtʌnᵊl/ /ˈvɪʒᵊn/|n}}
{{drop|T|he collection of}} [[bias]]es and misconceptions that can lead ''prosecutors'' — in the widest sense, being those who assert a particular model or [[narrative]] about the world, and not just ''literal'' criminal prosecutors — to frame and maintain curious views, often flying in the face of common sense. [[JC]] draws upon ''{{plainlink|https://media.law.wisc.edu/m/2fjzd/findley_scott_ssrn_copy-1.pdf|The Multiple Dimensions of Tunnel Vision in Criminal Cases}}'' by Keith Findley and Michael Scott in the Wisconsin Law Review (2006) and {{author|Robert Cialdini}}’s {{br|Persuasion}}, with a nod to the madness of crowds and Jon Haidt’s {{br|The Righteous Mind}}. The lesson we draw is that ''we are not as rational as we like to think''.


Prosecutor’s tunnel vision is therefore a useful template for screening all kinds of bad [[decision-making]] environments.


It may describe ''all'' view-forming of a “conviction” kind: such views resemble political and religious convictions in that, once they take root, they are not easily displaced.


The “wrongful conviction” cases are bracing because, with hindsight, armed with a better narrative and not having taken the cognitive path that led prosecutors to their position, it is so hard to understand how they got there, or why they persist with such plainly untenable views. If we treat prosecutor’s tunnel vision as a variety of political or even religious conviction, we can see better how “prosecutors” can be so energetic in their maintenance of a bad [[model]]. It perhaps explains the gruesome in-house performance in the {{poh}}.


Prosecutors need not be ''literal'' prosecutors: campaigners for innocence and conspiracy theorists suffer at the hands of the same collection of cognitive traps. Both sides of the public conversation about [[Lucy Letby]] seem similarly afflicted with tunnel vision: hence allegations of conspiracy from both sides.
 
Tunnel vision has three phases: first, background conditions arise that make us ''vulnerable'' to tunnel vision; second, certain considerations push us into a given tunnel; third, cognitive artefacts ''keep'' us there.


Call these “setting out”, “getting there” and “staying there”.
In order of appearance:


==Background==
{{Drop|T|o get to}} a position where tunnel vision might take hold, we need:
====Anchoring effect====
When making decisions we tend to rely on the first piece of information we get. Initial impressions can disproportionately influence our strategy and assessment of responsibility. This is the [[anchoring]] effect.
====Overconfidence in expertise====
Those engaged as [[subject matter expert]]s may overestimate their ability to diagnose and judge subtle problems, especially those that are adjacent (or nearly adjacent) to their area of expertise. Experts tend to over-emphasise their own domains in forming conclusions — this is an entry condition to any professional calling — even when that domain is a relatively minor part of the story.
Where we are overly confident in our case — where we see its essential ''rectitude'' — we are less likely to consider alternatives, different models, theories or even evidence.


====To a man with a hammer...====
Cadets don’t sign up for basic training hoping not to shoot a gun.


The police show up for work to detect crime. Prosecutors to prosecute it. They are primed this way, as we all are: we like to be ''useful''. Presented with a mystery, an expert’s ideal outcome is not to declare “there is nothing to see here, folks”, pack up her equipment and go home.
 
The dogged sleuth who smelled something fishy, stuck at it over the objections of her superiors and endured the shadowy machinations of unseen malign forces is a literary archetype for a reason. We don’t hear so much about the obsessive geek who hounded an innocent woman to her grave: that is not a story we like to hear. (Survivorship bias.)
 
Something analogous to the “to a man with a hammer, everything looks like a nail” syndrome:
====Role pressure====
If there has been some apparent calamity, no-one will be satisfied by an agency that de-escalates. Officials will be under tremendous pressure to get results: West Yorkshire police interviewed Peter Sutcliffe on a number of occasions but missed opportunities to apprehend him because he did not fit a profile based on letters from “Wearside Jack”, who turned out to be a hoaxer. The pressure to hasten to convictions may arise from career aspirations, public expectations or institutional culture.
 
The literal meaning of “[[iatrogenic]]” — an illness produced by the treatment.
 
==Getting there==
====Extrapolation====
Especially vulnerable are [[subject matter expert]]s. We are natural problem-solvers and model-builders, and we easily slip beyond our brief into matters where we have little experience. A statistician can give us a compelling account of the Bayesian probabilities, but she is beyond her expertise when she strays into the ''causes'' of elevated insulin readings in a sample containing an apparently significant cluster. Likewise, a medical expert may opine ''that'' the insulin readings are elevated beyond what would usually be expected, but ''who or what caused'' the excess insulin is a forensic question about which a diabetes specialist has no better idea than anyone else.
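It is worth seeing how far short of the forensic question even impeccable statistics fall. The following minimal sketch (every number in it is invented for illustration, and it is JC’s gloss, not Findley and Scott’s) shows why a cluster that looks damning on one ward may be unremarkable across a whole health system:
<syntaxhighlight lang="python">
# Toy calculation (every figure invented) of how surprising a mortality
# cluster on one ward really is, once we remember how many wards there are.
# The statistician can quantify the surprise; she cannot say what caused it.
import math

def poisson_tail(mean: float, k: int) -> float:
    """Return P(X >= k) for a Poisson-distributed count X with the given mean."""
    return 1.0 - sum(math.exp(-mean) * mean ** i / math.factorial(i) for i in range(k))

baseline_deaths = 2.0   # hypothetical: typical deaths per ward per year
observed_deaths = 7     # hypothetical: the cluster that drew suspicion
n_wards = 400           # hypothetical: comparable wards nationwide

p_one_ward = poisson_tail(baseline_deaths, observed_deaths)
p_somewhere = 1.0 - (1.0 - p_one_ward) ** n_wards

print(f"P(cluster this size on a given ward): {p_one_ward:.4f}")   # ~0.0045
print(f"P(cluster this size on some ward):    {p_somewhere:.2f}")  # ~0.84
</syntaxhighlight>
Even granting the statistician her sums, nothing in them says ''why'' the cluster arose: that remains a forensic question.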
 
====Precedent: bad heuristics====
The greater the expertise, the more grooved the expectations, the stronger the [[heuristic]] and the greater the temptation to take that shortcut and ''presume'' that this is one of those cases. In the great preponderance of cases this is a handy, efficient shortcut; in the rare case that presents one way but is in fact an exception, it is dangerous. Heuristics are excellent, efficient devices (as Gerd Gigerenzer notes, they help us catch balls without performing differential equations), but when the model is wrong they lead us into trouble.
 
====Confirmation bias====
Professor A. F. Chalmers noted that observation is theory-dependent: scientists must first have a hypothesis to test before they can gather evidence, for otherwise they cannot know what evidence is even relevant. If you are observing cars to derive laws of motion, is their colour relevant?<ref>Yes, is the counterintuitive answer: the Doppler effect shifts the visible wavelength of light. But you must have a theory that predicts this before you even know to look for it.</ref>
Having picked a bad [[heuristic]], we tend to seek out, interpret, and best remember information that confirms it. We may overweight evidence that supports our theory and disregard or minimise anything that contradicts it.


====Selective information processing====
Focusing on certain pieces of evidence while ignoring others. Prosecutors might present only the evidence that strengthens their case and neglect exculpatory evidence that could help the defence.


====Groupthink====
Thinking or making decisions as a group in a way that discourages creativity or individual responsibility. Prosecutors might conform to the prevailing opinion within their office, stifling dissenting views and critical analysis of the case. See also [[Dan Davies]]’ idea of “[[accountability sink]]s”. The {{poh}} is perhaps the archetypal example of groupthink.


====Reductionism====
Drilling deep into technical details that, by themselves and shorn of all context, seem to lead to one conclusion — especially one to which you are already [[anchor]]ed — notwithstanding a wider picture that makes the hypothesis unlikely. The risk is greatest in cases with no direct evidence.


Prosecutors’ focus on “blood” sprayed up in the footwell of the Chamberlains’ car led them to a theory that Azaria was murdered there, despite no evidence supporting that theory and quite a lot against it — principally, that Lindy Chamberlain had no time to do any such thing. The prosecution started with “murder in the car” as its anchor and hypothesised a whole story around it, for which there was no supporting evidence but also no contradicting evidence, so it was “possible”. There is a ''lot'' of this in the [[Lucy Letby]] case, on both sides.


==Staying there==
====Hindsight bias and the reiteration effect====
In hindsight, people tend to think an eventual outcome was inevitable, or at least more likely and more predictable, than they would have judged before it happened. Compare: “What is the chance that that nice woman you met at the campsite just now will, in three hours, brutally murder her own nine-week-old infant?” with “Given that this nine-week-old child has disappeared from the campsite, and the police suspect the mother of foul play, what is the prospect that her mother brutally murdered the child?”
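The gap between those two questions is, at bottom, the gap between a prior and a posterior probability. A toy Bayesian calculation (all numbers invented for illustration) shows how conditioning on the disappearance transforms the question without, by itself, licensing anything like certainty:
<syntaxhighlight lang="python">
# Toy Bayes update (all numbers invented) contrasting the prior question
# ("will this mother murder her child?") with the posterior one
# ("given the disappearance, did she?").

p_murder = 1e-6                  # hypothetical prior: a mother murders her infant
p_disappear_if_murder = 0.9      # hypothetical likelihood of disappearance given murder
p_disappear_if_innocent = 1e-5   # hypothetical likelihood otherwise (accident, animal attack)

# Bayes' theorem: P(murder | disappearance)
numerator = p_disappear_if_murder * p_murder
evidence = numerator + p_disappear_if_innocent * (1 - p_murder)
posterior = numerator / evidence

print(f"Prior:     {p_murder:.7f}")   # one in a million
print(f"Posterior: {posterior:.3f}")  # ~0.08: vastly higher, but nowhere near proof
</syntaxhighlight>
The posterior is orders of magnitude above the prior, but still nothing like proof; hindsight bias invites us to read the eventual verdict back into the campsite, as though it were obvious all along.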


Through “hindsight bias” we project new knowledge (of actual outcomes) onto our knowledge of the past (observed behaviour), without realising that the perception of the past has been tainted by the subsequent information.


Once a person becomes the prime suspect, and prosecutors have determined who ''they'' believe is guilty, hindsight bias suggests that, upon reflection, the suspect was the inevitable and likely suspect from the beginning. Evidence is malleable in the light of this “realisation”.


This is compounded by a “reiteration” effect. Our confidence in a theory increases the more we hear it, independent of its truth or falsity. The longer that police, prosecutors and witnesses live with a conclusion of guilt, the more entrenched their conclusion becomes, and the more obvious it appears that all evidence pointed to that conclusion from the very beginning. This “reiteration effect” makes it increasingly difficult for police and prosecutors to consider alternative perpetrators or theories of a crime.


====Outcome bias====
Like hindsight bias, “outcome bias” involves projecting subsequent outcomes onto observed behaviour, but it concerns not the likelihood of an event but the quality of a suspect’s ''decision''. Subjects are more likely to judge as bad a suspect’s decision to operate when told the patient died during surgery than when told the patient survived. This is the “operator error” presumption from {{fieldguide}}.


====Sunk cost fallacy====
The inclination to continue an endeavour once money, effort, time ''or credibility'' has been invested, even when new evidence suggests the defendant might be innocent. (See also [[commitment]] in the context of [[persuasion]].)


====Cognitive dissonance====
The discomfort experienced when holding two conflicting cognitions. To reduce the discomfort, prosecutors may rationalise or dismiss information that challenges their belief in the defendant’s guilt.

====Belief perseverance====
Maintaining a belief despite new information that firmly contradicts it. Even in the face of strong contrary evidence, prosecutors may cling to their original theory.

====Ethical blindness====
The inability to see the ethical dimensions of a situation due to focusing on other aspects. Prosecutors might neglect the ethical implications of their actions in the pursuit of winning a case.

<references />