
Forensic Flaws and Wrongful Convictions
Edwin Colfax, The Justice Project
October 8, 2010



Page 1:

Forensic Flaws and Wrongful Convictions

Edwin Colfax
The Justice Project

October 8, 2010

Page 2:

Related by Evan Hodge, former chief of the FBI Firearms and Toolmark Unit:

A detective goes to a ballistics expert along with a .45 pistol and a bullet recovered from a murder victim.

“We know this guy shot the victim and this is the

gun he used. All we want you to do is confirm what we already know so we can get a warrant to get the scumbag off the street. We will wait. How quick can you do it?”

An Anecdote. . .

Page 3:

The analyst conducted tests and provided a finding that linked the slug to the gun.

The suspect was confronted with this damning forensic evidence in an interrogation that ended with his confession.

An Anecdote. . .

Page 4:

The defendant then led the police to a different .45 pistol, which tests later showed was the true murder weapon.

-- Evan Hodge, Guarding Against Error, 20 Ass’n Firearms & Toolmark Examiners’ J. 290, 292 (1988).

An Anecdote. . .

Page 5:

Discovered wrongful convictions
◦ Prevalence of forensic flaws in those cases
◦ Ways these cases are and are not representative

A provisional taxonomy of forensic flaws
◦ Many different ways things can go wrong
◦ Different responses required to address them

A closer look at DNA exoneration cases
◦ Flawed forensic testimony
◦ Relevance of these errors today, and lessons

Overview

Page 6:

To date, 259 people have been exonerated in the United States by post-conviction DNA testing.

Texas leads the nation with 40 (42) DNA exonerations.

Wrongful Convictions

Page 7:

In more than 50% of DNA exonerations, unvalidated or improper forensic science contributed to the wrongful conviction, making it the second most prevalent factor contributing to wrongful convictions.

--The Innocence Project

DNA Exonerations and Forensic Problems

Page 8:

In 60% of DNA exoneration cases, “forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data.”

--Garrett/Neufeld Study: Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009.

DNA Exonerations and Forensic Problems

Page 9:

DNA demonstrates the power of forensic science and exposes previously unrecognized limitations of forensic science.

The overwhelming majority of criminal cases are not amenable to DNA testing.

Over-representation of certain kinds of cases: rape and, to a lesser degree, murder.

The Limits of DNA

Page 10:

Discovered wrongful convictions represent only a tiny fraction of criminal convictions, BUT the discovered error is not all of the error.

In non-DNA cases, the innocent have a very difficult, virtually insurmountable burden.

While the true rate of wrongful conviction is difficult to know, we should be focusing on the fact that much (not all) of this error is preventable. We have identified patterns of preventable error.

The costs of wrongful conviction are so grave and profound (loss of liberty, harm to public safety, collateral damage, financial costs) that we must take all reasonable steps to reduce the risk.

Discovered Wrongful Convictions and the Magnitude of the Problem

Page 11:

Deliberate Misconduct
Technical Incompetence
Unvalidated Methodologies
Communication Errors
Interpretive Errors

Taxonomy of Forensic Flaws

Page 12:

Deliberate Misconduct
◦ Dry-labbing: lying about what tests were done
◦ Intentional withholding of exculpatory findings
◦ Intentional falsification of reports
◦ Faked autopsies
◦ Cheating on proficiency exams

E.g., Fred Zain (WV, TX), Joyce Gilchrist (OK), Ralph Erdmann (TX)

Taxonomy of Forensic Flaws

Page 13:

Incompetence
◦ Inadequate training
◦ Contamination
◦ Poor quality control
◦ Facilities and equipment problems
◦ Chain of custody problems
◦ Collection problems
◦ Storage

Taxonomy of Forensic Flaws

Page 14:

Unvalidated Methodologies
◦ Dog Scent Lineups
◦ Bullet Lead Analysis
◦ Voice Print Analysis
◦ Lip Prints
◦ Microscopic Hair Comparison
◦ Predictions of future dangerousness

Taxonomy of Forensic Flaws

Page 15:

Communication Errors
◦ Disclosing available evidence
◦ Ordering tests
◦ Follow-up on reference samples
◦ Changes in scientific standards
◦ Testimony problems
  - Mistaken elaboration (false conclusions)
  - Exaggerating evidentiary significance
    · Explicitly
    · By omission

Taxonomy of Forensic Flaws

Page 16:

Interpretation Errors
◦ Inadvertent bias
  - Confirmation bias (investigative tunnel vision)
  - Domain-extraneous information (manipulation of decision threshold)
  - Group-think/role bias

Taxonomy of Forensic Flaws

Page 17:

Deliberate Misconduct
◦ Closer monitoring, full documentation, redundancy, audits, full discovery (incl. bench notes)

Incompetence
◦ Training, certification, monitoring, audits, blind proficiency testing

Unvalidated Methodologies
◦ Research, documentation, standardization

Communication Errors
◦ Closer review of testimony, standardization of reporting and relevant terminology

Interpretive Errors
◦ Regulating flow of information, independence of labs

Many Solutions for Many Problems

Page 18:

Invalid Forensic Science Testimony and Wrongful Convictions, Virginia Law Review, March 2009.

Took documented DNA exoneration cases (232 at the time of the study last year, now up to 259), identified those that included forensic science testimony (156), and reviewed all of the transcripts they could obtain (137).

“In 82 of these cases, or 60%, forensic analysts called by the prosecution provided invalid testimony at trial, that is, testimony with conclusions misstating empirical data or wholly unsupported by empirical data.”

The Garrett/Neufeld Study

Page 19:

A total of 72 different analysts presented invalid forensic testimony against a person who would later be proven innocent.

They worked for 52 different labs, in 25 states.

A few bad apples?

Page 20:

“The adversarial process largely failed to police this invalid testimony. Defense counsel rarely cross-examined analysts concerning invalid testimony, and rarely obtained experts of their own.”

The Adversarial Process

Page 21:

The study looked only at problems in how evidence was presented in testimony, not at underlying errors in the actual analysis or at whether the methodologies are sound.

The authors call for new oversight mechanisms for reviewing forensic testimony and the development of clear scientific standards for written reports and testimony.

The Garrett/Neufeld Study

Page 22:

Most fell into 2 categories:

1. Serology (analysis of bodily fluids to determine blood type characteristics): Type A/B/O, plus “secretor status” – whether or not a person secretes blood type substances into bodily fluids (e.g., saliva or semen).

2. Microscopic Hair Comparison (MHC): the visual comparison of questioned hairs from a crime scene and known exemplars to determine the presence of shared characteristics.

What kinds of flawed forensic testimony occurred?

Page 23:

Other kinds of forensic testimony that clearly went beyond empirical standards included:
◦ forensic odontology (bite mark)
◦ shoe print
◦ fingerprint

Three additional cases involved withholding of exculpatory forensic evidence (discovered through post-conviction proceedings).

What kinds of flawed forensic testimony occurred?

Page 24:

100 cases involved serology; 57 flawed (57%).
65 MHC cases; 25 with invalid testimony (38%).
13 fingerprint cases; 1 with flawed testimony.
11 DNA cases; 3 with flawed testimony.
6 forensic odontology cases; 4 with invalid testimony.
4 shoe print comparisons; 1 with flawed testimony.
1 voice comparison case with flawed testimony.
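As a quick sanity check, the flawed-testimony rates above follow directly from the counts quoted on this slide (a minimal sketch; the dictionary layout is mine, the figures are Garrett/Neufeld's):

```python
# (total cases with this kind of testimony, cases with flawed testimony),
# per the Garrett/Neufeld figures quoted above.
cases = {
    "serology": (100, 57),
    "microscopic hair comparison": (65, 25),
    "fingerprint": (13, 1),
    "DNA": (11, 3),
    "forensic odontology": (6, 4),
    "shoe print": (4, 1),
    "voice comparison": (1, 1),
}

for discipline, (total, flawed) in cases.items():
    # e.g. serology: 57/100 = 57%; MHC: 25/65 = 38%
    print(f"{discipline}: {flawed}/{total} = {100 * flawed / total:.0f}%")
```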

Incidence of Types of Analysis

Page 25:

1. Misuse of empirical population data.
Example: Gary Dotson – testimony that Dotson was among the 11% who could have been the donor, when in fact it was 100%.

2. Unsupported conclusions about the probative value of evidence (e.g., providing opinions on the significance of evidence without any empirical support).
Example: Durham case – testimony that a particular reddish-yellow hair color occurs in about 5% of the population. No empirical data exist on the frequency of hair characteristics.

Two basic types of invalid testimony identified

Page 26:

We cannot say that in every case where bad forensic testimony occurred it “caused” the wrongful conviction, since other kinds of evidence, such as an eyewitness identification, were usually also presented.

Could the presence of other evidence have contributed to the faulty interpretation and testimony? (We tend to see/find what we expect or desire to find, or what makes the most sense given other things that we know.)

“Causation” is Unclear

Page 27:

Byrd is a non-secretor. No antigens were detected on a stain at the crime scene, so the analyst assumed that the victim was also a non-secretor.

The analyst testified that 15-20% of the population are non-secretors.

In fact, no donor could be eliminated because no determination had been made about the victim's secretor status (so it's impossible to know whether her blood group markers masked the perpetrator's) and because the sample could have lacked antigens due to degradation. (Garrett/Neufeld, March 2009)

Kevin Byrd, TX

Page 28:

The victim was a B secretor and Dominguez was an O secretor. Two of the tested stains had B and H antigens, which were consistent with the victim. However, the analyst testified that Dominguez could not be excluded and that O secretors comprise 36% of the population.

In fact, nobody in the population could be excluded because the victim's blood group markers could have masked the perpetrator's. (Garrett/Neufeld, March 2009)

Alejandro Dominguez, IL

Page 29:

The victim and Dotson were both B secretors. B substances were found on the victim's underwear, and the analyst testified that the donor was a B secretor. Those substances could have been entirely from the victim, so any male could have been the donor. Another stain had A antigens that were foreign to both Dotson and the victim, but the analyst failed to exclude Dotson as the source, telling the court it could be a mixture of blood and sweat, wood, leather, detergents, or other substances.

(Garrett/Neufeld, March 2009)

Gary Dotson, IL

Page 30:

An analyst did not detect blood group substances in fluids from the crime. The analyst testified that this meant the perpetrator was a nonsecretor. In fact, if the victim was a non-secretor nobody could be excluded because her blood group markers could mask the perpetrator's, or the lack of blood group substances could have been the result of degradation.

(Garrett/Neufeld, March 2009)

Dennis Fritz, OK

Page 31:

The victim and Green were both B secretors, and the stain showed both B and H antigens. The analyst testified that B secretors were 16% of the population; the analyst conclusively ruled out 84% of the population as the source. The testimony failed to account for the possibility that the victim's blood group markers could mask the perpetrator's. (Garrett/Neufeld, March 2009)
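The masking problem that runs through these serology cases can be made concrete with a small sketch (a hypothetical helper, not from the talk; the blood-type frequencies below are invented for illustration): once every antigen in the stain is already accounted for by the victim, the stain carries no information about the second contributor, so the excluded fraction of the population is 0%, not 84%.

```python
def excluded_fraction(stain_antigens, victim_antigens, type_frequencies):
    """Fraction of the population excluded as the second contributor to a
    mixed stain, given the victim's own blood group antigens.

    If the victim's antigens account for everything in the stain, any
    contributor's markers could be masked, so nobody can be ruled out.
    """
    foreign = set(stain_antigens) - set(victim_antigens)
    if not foreign:
        return 0.0  # victim masks everything; no one is excluded
    # Otherwise only donors who could shed all the foreign antigens remain included.
    included = sum(freq for antigens, freq in type_frequencies.items()
                   if foreign <= set(antigens))
    return 1.0 - included

# Green case: victim is a B secretor (sheds B and H); stain shows B and H.
# Frequencies here are made up purely to run the example.
freqs = {("B", "H"): 0.16, ("A", "H"): 0.32, ("A", "B", "H"): 0.04, ("H",): 0.48}
print(excluded_fraction({"B", "H"}, {"B", "H"}, freqs))  # prints 0.0
```

Only when the stain contains an antigen foreign to the victim (as with the A antigens in the Dotson case) does the evidence actually narrow the pool of possible donors.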

Michael Anthony Green, OH

Page 32:

An analyst testified that the tested hair was “consistent” with Honaker and concluded that it came from Honaker or someone of the same race, coloring and microscopic makeup: "It is unlikely that the hair would match anyone other than the defendant; but it is possible." (Garrett/Neufeld, March 2009)

Edward Honaker, VA

Page 33:

An analyst testified that he was "certain" that Krone's teeth caused bites on the victim, and that it was "a very good match." He went on to say that bite mark comparison "has all the veracity, all the strength that a fingerprint would have." The prosecution also failed to disclose that an FBI expert had examined the bite marks and said they weren't from Krone. (Garrett/Neufeld, March 2009)

Ray Krone, AZ

Page 34:

The victim was an A secretor and Laughman was a B secretor. No B substances were detected in the evidence, but the analyst said bacteria could have "worked on these antigens" or they could have broken down. The analyst also testified that medications could have interfered with the antigens. The analyst then claimed that bacteria could actually convert one blood group substance to another: "Given sufficient time for those bacteria to act, it would be possible to convert a group A substance to a B, or a B substance to an A."

(Garrett/Neufeld, March 2009)

Barry Laughman, PA

Page 35:

Incorrect Hair Analysis. Comparing hairs from the crime with Dedge's hair, an analyst testified that "it would not be a million white people" who would possess such hairs, and that "out of all the pubic hairs I have examined in the laboratory, I have never found two samples, two known samples to match in their microscopic characteristics." (Garrett/Neufeld, March 2009)

Wilton Dedge, FL

Page 36:

Unlike MHC, the underlying science of serology is sound. There are very large blood type databases that allow us to say with confidence how prevalent various blood types are in the population, and among members of different ethnic groups.

But serology is a limited forensic methodology. It cannot individuate; it can only include (or exclude) a person in a category of people who may have contributed biological evidence found at a crime scene.

What went wrong with serology?

Page 37:


“We don’t use these anymore; we have more accurate methods, so the problems have been fixed or eliminated by the new science.”

Temptation to say that MHC and serology are old-fashioned, that our problem has been solved by DNA testing.

It is fair to say that we won’t see as many of the same kinds of errors anymore in cases where biological evidence is available, BUT. . . .

Lessons of MHC and Serology Error

Page 38:

It would be a mistake to think that DNA has solved the problem of invalid forensic testimony.

“DNA has replaced some, but not most, traditional forensic methods.”

Some estimates indicate that DNA testing accounts for only 2% of police requests to crime laboratories.

Lessons of MHC and Serology Error

Page 39:

Biological evidence is not available in the vast majority of cases.

Many other forensic methodologies are susceptible to the problems that undermined accuracy in the MHC and serology areas.

It is an accident that we have been able to establish conclusive error in so many MHC and serology cases: only because of the presence of biological evidence.

If document examination and ballistics cases, for example, typically had biological material where DNA testing could be dispositive, we may well have exposed more errors in those other forensic disciplines.

What can/should we learn from MHC and Serology Errors?

Page 40:

The vast majority of crime labs are operated by state and local law enforcement agencies.

The National Academy of Sciences urges that all crime labs be made independent of law enforcement agencies.

The Importance of Independence

Page 41:

We have a number of other discredited methodologies that have been used repeatedly in court despite lacking any scientific validity.

Dog scent line-ups
Lip prints
Bullet lead analysis

Notable Non-DNA Cases

Page 42:

For years, FBI lab experts would analyze the chemical or elemental composition of a slug recovered from a crime scene and purport to match it to a particular batch of bullets, such that if a defendant had a box of .38 caliber ammunition, an analyst could testify that the box was a likely source of the bullet recovered from a shooting victim. It was high-tech, using a scanning electron microscope, and as dazzling as CSI, but it turns out that the testimony was scientifically invalid, and the FBI no longer does it.

Bullet Lead Analysis

Page 43:

A high-profile non-DNA reversal came in a dog scent lineup case.

IPOT did a must-read report on unscientific dog scent lineups, focusing on a particular handler from Fort Bend County who testified in many cases.

Dog Scent Lineups

Page 44:

Knaves or fools? Or something else?

Very often the evidence was interpreted in such a way that it was “made to fit” with the prosecution’s theory of the case.

The usual presence of other evidence of guilt, most often eyewitness evidence, creates expectations for what the evidence will show.

What could explain errors?

Page 45:

Either these many analysts in many labs committed intentional misconduct, or

They lacked a basic understanding of their work, or

Their interpretations were influenced by features of normal human psychology that resulted in inadvertent bias.

Inadvertent contextual bias is a plausible explanation for some

Page 46:

We know that exposing analysts to “domain-extraneous information” can undermine the objective interpretation of evidence.

We know that perception and interpretation of evidence can be affected by our expectations and desires.

We know that decision thresholds can be affected by extraneous information and expectations.

Not merely theoretical

Page 47:

Experience in other contexts, like clinical drug trials.

Real life cases like the Brandon Mayfield case.

Experimental research documenting influence of scientifically irrelevant information on the conclusions of real forensic experts.

How do we know?

Page 48:

Traditionally the forensic science establishment has dismissed concerns about contextual bias and observer effects, appealing to the professionalism and training of analysts as a sufficient counter-weight to any potential influence.

Since we are dealing with aspects of human nature and normal human psychology, there is no basis to assume that these tendencies can be “trained away.”

It has only been VERY recently that the risks have begun to be taken seriously.

Admitting the risk of contextual bias is a crucial step

Page 49:

There are many forensic disciplines, some of which inherently have a more significant risk of contextual bias than others.

DNA analysis, drug testing, etc., are rooted in well-established scientific research and leave little room for subjective interpretation.

Other kinds of analysis have an interpretive element that is necessarily subjective, involving visual comparisons, for example.

Where the risk exists

Page 50:

This is not to suggest that there are not valid and reliable methods that have a subjective interpretive element.

Rather, where there is a subjective interpretive element, there is real risk of inadvertent contextual bias that must be addressed.

Validity challenge vs. Risk Management

Page 51:

Dror, I.E. and Rosenthal, R. (2008). Meta-analytically quantifying the reliability and biasability of forensic experts. Journal of Forensic Sciences, 53(4), 900-903. (Citing the underlying studies).

Experiments that look at “within-expert comparisons,” whereby the same expert unknowingly makes judgments on the same data at different times and with different contextual information, e.g., that the suspect had an alibi, that the suspect confessed, or, in one study, that the prints were the mistaken matches from the Mayfield case.

Experts, as humans, perceive and judge information based on circumstances, such as context, emotional states, expectations, and hopes. This is not a problem if the circumstances are relevant to their decision making, because by being relevant the new circumstances may actually change the decision problem itself. But what happens when experts are faced with extraneous circumstances which are not relevant and do not modify the decision problem?

Itiel Dror Experiments

Page 52:

In Dror’s initial study, experienced analysts were asked to evaluate a series of fingerprints to determine if they matched. Though the analysts believed the prints were for an actual, open case, they were actually reexamining prints they had correctly evaluated in the past, this time accompanied by artificial contextual information, such as that the suspect had confessed. The results were striking. In cases where analysts were given contextual information about the fingerprints, they were wrong in almost seventeen percent of the cases. These errors were particularly notable because the same analysts had previously evaluated the prints correctly.
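One way to picture the mechanism is as a shift in the examiner's decision threshold (a toy sketch, not Dror's actual protocol; the similarity scores and threshold values are invented for illustration):

```python
def match_calls(scores, threshold):
    """Declare a 'match' for every comparison whose similarity score
    clears the examiner's decision threshold."""
    return [s >= threshold for s in scores]

# The same set of borderline similarity scores, judged twice.
# (Hypothetical numbers, chosen to sit near the threshold.)
scores = [0.45, 0.58, 0.62, 0.66, 0.69, 0.72, 0.75, 0.64, 0.55, 0.71]

neutral = match_calls(scores, threshold=0.70)  # no contextual information
biased = match_calls(scores, threshold=0.60)   # told "the suspect confessed"

changed = sum(n != b for n, b in zip(neutral, biased))
print(f"{changed} of {len(scores)} conclusions flipped with no change in the evidence")
# prints "4 of 10 conclusions flipped with no change in the evidence"
```

The evidence never changes; only the extraneous context does, yet the borderline comparisons flip from "no match" to "match."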

The Biasing Effects of Contextual Information

Page 53:

Need to regulate the flow of information between analysts and investigators.
◦ Evidence intake and control
◦ Blind testing
◦ Evidence lineups?

Labs independent of law enforcement agencies are less susceptible to risk of inadvertent bias related to contextual information or perceived role.

Managing risk