
William Thompson points out that [Mark] Perlin has declined to make public the algorithm that drives the program. “You do have a black-box situation happening here,” Thompson told me. “The data go in, and out comes the solution, and we’re not fully informed of what happened in between.”

When I interviewed Perlin at Cybergenetics headquarters, I raised the matter of transparency. He was visibly annoyed. He noted that he’d published detailed papers on the theory behind TrueAllele, and filed patent applications, too: “We have disclosed not the trade secrets of the source code or the engineering details, but the basic math.”

This secrecy pisses me off as a fan of open-source: if a piece of software is critical to our lives, we should know how it works. As someone who cares about justice, it has me seeing red.

The Legal Aid Society of New York recently challenged a comparable software program, the Forensic Statistical Tool, which was developed in-house by the city’s Office of the Chief Medical Examiner. […]

In 2011, Legal Aid requested a hearing to question whether the software met the Frye standard of acceptance by the larger scientific community. To Goldthwaite and her team, it seemed at least plausible that a relatively untested tool, especially in analyzing very small and degraded samples (the FST, like TrueAllele, is sometimes used to analyze low-copy-number evidence), could be turning up allele matches where there were none, or missing others that might have led technicians to an entirely different conclusion. And because the source code was kept secret, jurors couldn’t know the actual likelihood of a false match.
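To make the "likelihood of a false match" concrete, here is a deliberately simplified sketch of the kind of basic math these tools build on: a single-source, single-locus likelihood ratio. This is a hypothetical illustration of my own (the function names and numbers are invented); real probabilistic genotyping software like TrueAllele or the FST models mixtures, allele drop-out and drop-in, and degraded samples, none of which appears here.

```python
# A deliberately simplified, single-source, single-locus likelihood ratio.
# Hypothetical illustration only: real probabilistic genotyping software
# models mixtures, allele drop-out/drop-in, and degradation.

def lr_heterozygote(p, q):
    """LR for a suspect whose heterozygous genotype matches the evidence.

    Numerator: P(evidence | suspect is the source), taken as 1 for a
    clean single-source sample. Denominator: the genotype's frequency
    in the population, 2*p*q under Hardy-Weinberg equilibrium.
    """
    return 1.0 / (2 * p * q)

def lr_homozygote(p):
    """Same idea for a homozygous genotype, whose frequency is p**2."""
    return 1.0 / (p * p)

# Two alleles carried by 10% and 5% of the population, respectively:
print(lr_heterozygote(0.10, 0.05))  # roughly 100: the evidence is about
                                    # 100x more likely if the suspect is
                                    # the source than if a random person is
```

The point of the sketch is how sensitive the result is to the modeling assumptions: change the assumed allele frequencies or relax the clean-sample assumption and the reported ratio moves by orders of magnitude, which is exactly why defense teams want to see what happens inside the black box.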

At the hearing, bolstered by a range of expert testimony, Goldthwaite and her colleagues argued that the FST, far from being established science, was an unknown quantity. (The medical examiner’s office refused to provide Legal Aid with the details of its code; in the end, the team was compelled to reverse-engineer the algorithm to show its flaws.)

But this is just one piece of a much larger problem: the forensic technology used in the courtroom is coming under intense scrutiny, and it's not holding up well under cross-examination. Even DNA evidence, which was quickly promoted to the gold standard, is starting to tarnish.

Last year, [Erin] Murphy published a book called Inside the Cell: The Dark Side of Forensic DNA, which recounts dozens of cases of DNA typing gone terribly wrong. Some veer close to farce, such as the 15-year hunt for the Phantom of Heilbronn, whose DNA had been found at more than 40 crime scenes in Europe in the 1990s and early 2000s. The DNA in question turned out to belong not to a serial killer, but to an Austrian factory worker who made testing swabs used by police throughout the region. And some are tragic, like the tale of Dwayne Jackson, an African American teenager who pleaded guilty to robbery in 2003 after being presented with damning DNA evidence, and was exonerated years later, in 2011, after a police department in Nevada admitted that its lab had accidentally swapped Jackson’s DNA with the real culprit’s.

Most troubling, Murphy details how quickly even a trace of DNA can now become the foundation of a case. In 2012, police in California arrested Lukis Anderson, a homeless man with a rap sheet of nonviolent crimes, on charges of murdering the millionaire Raveesh Kumra at his mansion in the foothills outside San Jose. The case against Anderson started when police matched biological matter found under Kumra’s fingernails to Anderson’s DNA in a database. Anderson was held in jail for five months before his lawyer was able to produce records showing that Anderson had been in detox at a local hospital at the time of the killing; it turned out that the same paramedics who responded to the distress call from Kumra’s mansion had treated Anderson earlier that night, and inadvertently transferred his DNA to the crime scene via an oxygen-monitoring device placed on Kumra’s hand.

But if I keep quoting the good bits, I’ll repost the entire article. Go read the original.