The FBI has a new headline-grabbing (and hair-raising) scandal connected to faulty forensics practices. This is not the first time the science prosecutors have used to put people behind bars has proven deeply flawed, and it is one more reason to push for reforms and better oversight.
The FBI has admitted that, prior to 2000, its hair examiners were dishing out claptrap in court and in their reports when describing hair analysis in hundreds of cases, though the flawed science involved could affect as many as 3,000 cases. The FBI itself quotes Peter Neufeld, co-founder of the Innocence Project: "These findings confirm that FBI microscopic hair analysts committed widespread, systematic error, grossly exaggerating the significance of their data under oath with the consequence of unfairly bolstering the prosecutions' case."
The FBI hair comparison experts were found to have made "erroneous statements" in about 96 percent of the studied cases in which "examiners provided testimony used to inculpate a defendant at trial." Errors were found in 33 cases in which the defendant was subsequently sentenced to death. Of those defendants, nine have already been executed, and five others died while on death row.
Worse still, this bad hair science is just the latest example of phony forensics. Another whole field of forensic science, compositional bullet lead analysis, was shown to be bogus in a 2004 National Academy of Sciences study. The FBI had been testifying that the chemical composition of a bullet could identify it down to the maker, the batch, or even, in some cases, the box. No, said the study: "The available data do not support any statement that a crime bullet came from a particular box of ammunition."
And then there's the much-questioned science of bite-mark analysis, extensively reported on by former Reason editor Radley Balko. A 2002 study found a "false positive" error rate of 64 percent in bite-mark analysis. The Chicago Tribune reported that the study's author "figured that on average, they falsely identified an innocent person as the biter nearly two-thirds of the time." Fortunately, the FBI does not do bite-mark analysis. Unfortunately, other labs do.
A 1992 study showed that many traditional arson investigation techniques were bogus. And yet Texas convicted Cameron Todd Willingham of murder mostly on the basis of those very techniques. In 2004, 12 years after the release of the report discrediting the crucial techniques used in Willingham's case, he was executed for his supposed crime.
Even fingerprints and DNA can go wrong. Fingerprints are pretty reliable when both the "known" and "unknown" images are clear and distinct. But the "unknown" image is often far from clear and distinct. It might be smudged, partial, overlain by other possibly smudged prints, or deposited on an irregular surface like wood grain. In those cases, errors become more likely.
In 2004, the FBI made a "100 percent match" of a print from the deadly Madrid train bombing to Portland-area lawyer Brandon Mayfield. It turned out to be 100 percent wrong. The FBI later apologized to Mayfield, who claimed to have been profiled because he was a convert to Islam, and paid $2 million to settle a suit he had filed against the bureau. In another famous misidentification, that of Shirley McKie, a Scottish police agency was found to have mistaken wood grain for fingerprint ridges!
In ideal conditions, DNA is our most reliable forensic technique. Conditions are less than ideal, however, if the crime-scene sample is small or corrupted, or if it contains DNA from more than one person. And we have seen mistakes there, too. Josiah Sutton was convicted of rape at the age of 16, largely on DNA evidence that was later shown to be bogus. He was imprisoned and not released until more than four years later.
What in the world is going on here? It's partly bad science, partly bad organization, and wholly unacceptable.
On television shows, forensic evidence is super-scientific and infallible. And yet we have seen over and over again in the real world decidedly unscientific techniques being used. A 2009 study by the National Academy of Sciences was blunt and plainspoken: "The bottom line is simple: in a number of forensic science disciplines, forensic science professionals have yet to establish either the validity of their approach or the accuracy of their conclusions, and the courts have been utterly ineffective in addressing this problem."
Subjective judgment is a big part of the problem. In hair microscopy, bullet-lead analysis, and fingerprint comparison, the forensic scientist is asked to make a subjective judgment of similarity. Even many cases of DNA analysis require subjective judgment. That doesn't seem very scientific.
And in most cases, public crime labs, such as the FBI lab, are part of a law enforcement agency, not independent reviewers. This is where bad organization factors in. The forensic scientists are supposed to be neutral, but they're working for the cops. A 2013 study (that I co-authored) shows that many public crime labs are funded in part per conviction. And in 14 states, such a financing system is required by state law.
If you work for the cops, you're bound to end up seeing things from their angle, however strong the desire for objectivity. That might be okay if forensic analysis actually were objective. It is not okay, however, when forensic analysis is subjective, which it often is. Add in a specific financial incentive to get convictions and you are pretty much asking crime labs to interpret everything as incriminating.
The organizational problems of forensic science are compounded by monopoly. Once evidence goes to a given crime lab, it is unlikely to be examined or interpreted by any other lab. Thus, we do not usually get a reality check on the work a crime lab does. That lets errors slip through undetected, sometimes for years or even, perhaps, forever.
With the right reforms, we can create a working system of checks and balances. Here are a few ideas Reason has promoted in the past. Maybe their time has finally come:
Cross-lab redundancy. A jurisdiction should contain several competing forensic labs. Some evidence should be chosen at random for multiple testing at other labs. This creates checks and balances.
Independence. Put crime labs under the department of health, not the cops.
Statistical review. Compare the results of different labs and look for statistical anomalies. An investigation may reveal bad practices to be eliminated or good practices to be emulated.
Sequential unmasking. Forensic scientists can be biased by scientifically irrelevant information such as the criminal history of the suspect. Sequential unmasking is an administrative control process similar to that used in double-blind research studies. It prevents forensic scientists from learning potentially biasing information until after they have made their determinations.
Forensic counsel for the indigent. Most criminal defendants cannot afford their own forensic experts. Basic fairness says that they should have a right to their own experts just like they have a right to counsel. A voucher system is the best way to provide defense experts.
Measures like these could help turn forensic science from a scandal pit into a source of improved criminal justice. That seems like a worthy goal in the context of today's troubled criminal justice system.