
Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy

DNA double helix on computer screen
Vera Eidelman, Staff Attorney, ACLU Speech, Privacy, and Technology Project
September 15, 2017

In today’s world, computerized algorithms are everywhere: They can decide whether you get a job interview, how much credit you can access, and what news you see. And, increasingly, it’s not just private companies that use algorithms. The government, too, is turning to proprietary algorithms to make profound decisions about your life, from what level of health benefits you receive to whether or not you get bail.

This isn’t necessarily good or bad. At their core, “algorithms” are just instructions, like a recipe or user manual, that use raw inputs to determine outcomes in all kinds of decision making. But it becomes a serious problem when the government keeps those algorithms — including the source code that executes the programs and the raw data that constitutes their inputs — secret from the public.

And that’s exactly what is happening in criminal trials around the country.

Take, for example, the case of Billy Ray Johnson, who was sentenced to life in prison without parole for a series of burglaries and sexual assaults he says he did not commit, largely based on the results of a proprietary algorithm called TrueAllele. TrueAllele claims to identify the perpetrator of a crime from a tiny, degraded DNA sample swimming in a larger soup of multiple individuals’ DNA. It’s an experimental technology, quite unlike the conventional DNA tests developed over the past two decades, which themselves have serious flaws. At Mr. Johnson’s trial, the court denied the defense team access to TrueAllele’s source code — information crucial to the defense case — all because the company that owns it cried, “Trade secret!”

As we explained in an amicus brief we filed in the case on Wednesday, this is unconstitutional in a number of ways. Our Constitution gives a defendant the right to confront the witnesses against him, and it provides him with the right to a fundamentally fair trial that includes a meaningful opportunity to present a complete defense. It also gives the public a right of access to criminal proceedings, including evidence, so that we can serve as a check upon the judicial process.

Access to the source code of algorithms used in the criminal justice system is critical to ensure fairness and justice. Algorithms are human constructs that are subject to human error, which can plague them throughout their design and use. For example, at the building stage, something as simple as a misplaced ampersand can have profound implications. A coding error in another DNA algorithm was recently found to have produced incorrect results in Australia, altering its reported statistics by a factor of 10 and forcing prosecutors to replace 24 expert statements.
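To make the ampersand point concrete, here is a toy Python sketch of our own. The function names, the 50-unit threshold, and the peak values are all invented for illustration; this is not anyone’s actual forensic code. It shows how a single mistyped character can silently change which DNA peaks an analysis keeps:

    # Hypothetical sketch: a one-character slip ('&' instead of 'and')
    # silently changes which peaks pass a quality filter.
    THRESHOLD = 50  # invented minimum peak height for a "real" peak

    def keep_peak_intended(height, in_range):
        # Intended test: the peak is tall enough AND within range.
        return height > THRESHOLD and in_range

    def keep_peak_buggy(height, in_range):
        # Python's bitwise '&' binds tighter than '>', so this parses as
        # height > (THRESHOLD & in_range), which reduces to height > 0.
        # Every nonzero peak now passes, including junk the filter was
        # supposed to exclude.
        return height > THRESHOLD & in_range

    peaks = [(30, True), (70, True), (70, False), (10, False)]
    print([keep_peak_intended(h, ok) for h, ok in peaks])  # [False, True, False, False]
    print([keep_peak_buggy(h, ok) for h, ok in peaks])     # [True, True, True, True]

No error message is raised; the program simply reports different results. Only someone reading the source code would ever notice.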

Beyond random mistakes, people hold cognitive biases that can materially affect the variables they include in an algorithm, as well as how they interpret the results. Racial bias also often creeps into algorithms, both because the underlying data reflects existing racial disparities and because inaccurate results for smaller minority groups may be hidden in overall results.
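A toy example of that second problem, with numbers we invented purely for illustration: an algorithm can look roughly 98 percent accurate overall while failing one group ten times as often, and the headline number hides it.

    # Invented numbers: aggregate accuracy masks subgroup error rates.
    majority = {"n": 9000, "errors": 90}    # 1% error rate
    minority = {"n": 1000, "errors": 100}   # 10% error rate

    overall = (majority["errors"] + minority["errors"]) / (majority["n"] + minority["n"])
    print(f"overall error rate:  {overall:.1%}")                             # 1.9%
    print(f"minority error rate: {minority['errors'] / minority['n']:.1%}")  # 10.0%

A validation study that reports only the first number looks reassuring while the second goes unexamined.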

And, of course, there’s the possibility that financial incentives will pervert the goals of companies that build these algorithms. In the context of DNA typing, the prosecution, backed by the substantial resources of the state, is a company’s most likely customer — and that customer is likely to be most satisfied with an algorithm that delivers a match. So companies may build programs to skew toward matches over the truth.
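To see why hidden assumptions matter, consider a toy likelihood-ratio calculation. The model and every number in it are our own inventions, not any vendor’s method. Probabilistic genotyping programs weigh hypotheses like “the suspect contributed, but one of his alleles dropped out of the sample,” and the assumed drop-out rate is a knob that moves the answer:

    # Toy model: the suspect's second allele was not observed in the evidence.
    def locus_lr(dropout_rate):
        # Prosecution hypothesis: the suspect contributed and the missing
        # allele dropped out (probability = dropout_rate, an assumed parameter).
        p_prosecution = dropout_rate
        # Defense hypothesis: a random person carries the visible allele
        # (invented population frequency of 5%).
        p_defense = 0.05
        return p_prosecution / p_defense

    for d in (0.01, 0.10, 0.50):
        print(f"assumed drop-out {d:.0%}: LR = {locus_lr(d):.1f}")
    # 1%:  LR = 0.2  (favors the defense)
    # 10%: LR = 2.0
    # 50%: LR = 10.0 (favors the prosecution)

The same physical evidence can favor the defense or the prosecution depending entirely on an assumption buried in the code.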

In Mr. Johnson’s case, the trial court decided to ignore these potential pitfalls — and, more significantly, the defendant’s constitutional rights — ruling in favor of TrueAllele’s argument for secrecy. This is legally wrong and has troubling practical implications. Research shows that juries put too much trust in uncontested algorithms. Prosecutors and their expert witnesses present their results as infallible truth. And juries, when given no other option, generally do not question them.

But the results need to be questioned, and this case demonstrates why.

TrueAllele’s parent company, Cybergenetics, and a government lab that bought the algorithm to run in-house got wildly different results — both from themselves on different test runs and from each other overall. Indeed, TrueAllele’s creator testified that he expected the government’s results, generated by running the same data through the same program, to be “within two zeros,” or a factor of 100, of his results. Yet even though he expected a significant discrepancy, he was able to offer his results as unquestioned evidence, all while the defense was given no meaningful opportunity to challenge his testimony.
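Part of the explanation is that these programs are stochastic: probabilistic genotyping systems like TrueAllele are reported to rely on Monte Carlo sampling, so two runs on identical data start from different random draws. Here is a minimal sketch, with invented numbers rather than Cybergenetics’ actual model, of how sampling noise compounds across loci into orders-of-magnitude swings:

    import random

    def reported_log10_lr(seed, loci=15):
        # Toy run: each locus's true log10 likelihood ratio is 1.0, but the
        # sampler estimates it with noise (sd 0.25). Per-locus LRs multiply,
        # so their log values add and the noise compounds.
        rng = random.Random(seed)
        return sum(1.0 + rng.gauss(0, 0.25) for _ in range(loci))

    for seed in (1, 2, 3):
        print(f"run {seed}: LR about 10^{reported_log10_lr(seed):.1f}")
    # Successive runs on the same data can land an order of magnitude or
    # two apart -- "within two zeros," in the creator's phrase.

Whether that spread is scientifically acceptable is exactly the kind of question the defense should be allowed to probe.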

Access to similar DNA algorithms has revealed serious errors in them. Much like the example from Australia, a recent case in New York revealed that another DNA algorithm “dropped valid data from its calculations, in ways . . . that could unpredictably affect the likelihood assigned to the defendant’s DNA being in the mixture.” This was only discovered after the trial court correctly ordered that the algorithmic source code be disclosed to the defense, prompting the prosecution to withdraw the evidence. Yet courts continue to admit the results of other DNA algorithms, like TrueAllele, without disclosure to the defense or the public.
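The arithmetic of why dropped data matters is simple. Per-locus likelihood ratios are multiplied into one overall number, so silently excluding loci that favor the defendant inflates the total. A toy sketch, using values we made up for illustration:

    # Invented per-locus likelihood ratios; values below 1 favor the defense.
    per_locus_lr = [8.0, 12.0, 0.05, 9.0, 0.1, 15.0]

    def combined(lrs):
        total = 1.0
        for lr in lrs:
            total *= lr
        return total

    print(f"all loci:             {combined(per_locus_lr):,.0f}")  # ~65
    print(f"defense loci dropped: {combined([lr for lr in per_locus_lr if lr >= 1]):,.0f}")  # ~12,960

Dropping the two exculpatory loci turns a weak statistic into one roughly 200 times stronger, and nothing in the reported number reveals that it happened.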

This isn’t the first time we’ve been down this road with technology in criminal courts. There is a long history of junk science being used under the guise of technological advance. Public access to such evidence was a prerequisite to establishing its invalidity.

In the 1990s, “a series of high-profile legal challenges” and growing scrutiny of forensic evidence caused various long-standing methods — from bite-mark analysis to ballistics testing and from fingerprinting to microscopic hair comparison — to get “deflated or outright debunked.” Similarly, after a New Yorker exposé revealed a flawed case based on arson science, the state responsible not only reconsidered old cases handled by the original investigators, but also “reinvented itself as a leader in arson science and investigation.”

Scientific errors in the criminal justice system are a serious problem. But the examples above also reveal the power of adversarial testing and public scrutiny to correct those errors and create better science.

We hope the California appellate court agrees with us and orders disclosure of the algorithmic source code. An adversarial testing process is crucial to ensure that Mr. Johnson’s constitutional rights are enforced.
