The State Is Trying to Fix the Biased Bail System—but Do Reform Efforts Go Far Enough?
New Jersey is trying a new algorithm to fix its broken bail system, a flashpoint for criminal justice advocates who argue that court-assessed bail amounts can discriminate against low-income and heavily policed communities, most often people of color.
Guidelines for how judges set bail vary across the country, but they generally combine a bail schedule, which sets dollar amounts for specific offenses, with the judge's own assessment of whether the defendant will appear at their hearing or commit a crime before trial. If you can't pay, you stay in jail until your trial date, sometimes for up to a month.
On January 1, New Jersey replaced its cash bail system with an algorithm designed to mathematically assess the risk that a defendant will flee or commit a crime, particularly a violent one, before trial. The algorithm, called the Public Safety Assessment, was designed by the Texas-based Laura and John Arnold Foundation, a nonprofit that funds innovative approaches to criminal justice reform.
New Jersey isn't the first state to use algorithms to help judges suss out high-risk defendants. Counties across the country have tried using computer-based techniques to flag those who should continue to be detained until trial, and those who are flight risks.
But the algorithms are not without flaws. Last year, investigative reporting by ProPublica revealed that these programs had built-in racial biases, too. The software assessed risk based on data points gleaned from interviews with defendants, including questions about ZIP code, educational attainment, and family history of incarceration, all of which can serve as proxies for race.
This system is different, according to Matt Alsdorf, vice president of the foundation's Criminal Justice Initiative. The initiative assembled a dataset of more than 100,000 individual cases and looked for factors that defendants who reoffended had in common. His team found that the data points most closely correlated with race weren't actually very useful as predictors. However, the foundation has yet to release the dataset for public analysis.
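The foundation hasn't published that analysis or its underlying data, so the sketch below is only a rough illustration of the kind of screening Alsdorf describes. The case records and column names (prior_convictions, zip_code_poverty, defendant_is_black, failed_pretrial, and so on) are invented; the idea is simply to compare how strongly each candidate factor tracks the pretrial outcome versus how strongly it tracks race.

```python
# Illustrative sketch only: the Arnold Foundation has not released its data or
# code, so the columns and the screening rule here are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1_000

# Fake case records standing in for the 100,000-case dataset described above.
cases = pd.DataFrame({
    "prior_convictions":  rng.poisson(1.0, n),
    "prior_fta":          rng.poisson(0.3, n),    # prior failures to appear
    "zip_code_poverty":   rng.random(n),          # candidate proxy feature
    "education_years":    rng.normal(12, 2, n),   # candidate proxy feature
    "defendant_is_black": rng.integers(0, 2, n),  # protected attribute
    "failed_pretrial":    rng.integers(0, 2, n),  # outcome: new arrest or FTA
})

candidate_features = ["prior_convictions", "prior_fta",
                      "zip_code_poverty", "education_years"]

# For each candidate factor, measure its correlation with the outcome and with
# race; factors that track race more strongly than the outcome look like proxies.
report = pd.DataFrame({
    "corr_with_outcome": [cases[f].corr(cases["failed_pretrial"])
                          for f in candidate_features],
    "corr_with_race":    [cases[f].corr(cases["defendant_is_black"])
                          for f in candidate_features],
}, index=candidate_features)

keep = report[report["corr_with_outcome"].abs() > report["corr_with_race"].abs()]
print(report.round(3))
print("Retained factors:", list(keep.index))
```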
"The strongest predictor of pretrial failure largely has to do with someone's prior conduct," he said. The algorithm uses conviction records instead of arrest records, which are less likely to tip the scales against individuals in heavily policed neighborhoods—studies have found that the arrest rate for black people can be up to ten times higher than for non-blacks.
Still, Alsdorf conceded that even conviction records aren't actually race-blind. Advocates contend that there's a significant disparity in conviction rates for black and white defendants, though the Bureau of Justice Statistics doesn't release data on conviction rates by race.
Cathy O'Neil, a former Barnard College math professor and the author of Weapons of Math Destruction, says there is still a concern about over-policing of black neighborhoods in general, particularly for nonviolent offenses. And she's not sure how the algorithm will fix the issue.
"I'm glad to see that they're making an effort," she said, "but they haven't actually opened up to audit." The Arnold Foundation has released their methodology, but hasn't gone through the kind of rigorous audit ProPublica performed on the COMPAS program by the private company Northpointe, which also used algorithmic risk assessments.
There's also a question of whether it's fair to dole out punitive measures, especially ones that could lead to extended pretrial detention, on the basis of this kind of predictive policing.
Judges are already using their own judgments as a kind of predictive policing, Alsdorf argues, and the results aren't much better. "Those systems are the ones that generate the status quo," he said. "What we see in the status quo is significant disparities in detention rates of white people and people of color."
As for failures to appear at trial, O'Neil said it's possible to set a standard that everyone can live with, and that might involve more innovative solutions.