Algorithmic pretrial risk assessment may just be more common than you think

Californians recently voted to reject Proposition 25, which sought to replace cash bail throughout the state with algorithmic risk assessments. But, like it or not, government agencies are already moving forward with algorithmic pretrial reform efforts. For example, the Los Angeles Superior Court piloted a program in March 2020 that uses a tool to calculate a defendant’s risk of failing to appear in court and of recidivating before trial. Outside California, the New York City Criminal Justice Agency has a similar release assessment that draws on data from over 1.6 million previous cases to calculate a risk score that informs judges’ pretrial decisions. And communities like Pierce County in Washington State are working with the National Partnership for Pretrial Justice to develop, implement, and research pretrial risk assessment systems.
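
To make the idea concrete, here is a minimal sketch of how a points-based pretrial risk score might be computed. The factor names, point weights, and cutoffs below are hypothetical, invented for illustration only; they are not the formulas used by the Los Angeles, New York, or Pierce County tools, which are typically derived from statistical analysis of historical case data.

```python
# Hypothetical sketch of a points-based pretrial risk score.
# Factor names, weights, and cutoffs are invented for illustration;
# real tools derive them from statistical analysis of historical cases.

def failure_to_appear_score(defendant: dict) -> int:
    """Sum points for factors commonly associated with missed court dates."""
    points = 0
    if defendant.get("prior_failure_to_appear", 0) >= 1:
        points += 2
    if defendant.get("pending_charge_at_arrest", False):
        points += 1
    if defendant.get("prior_convictions", 0) >= 3:
        points += 1
    if defendant.get("age_at_arrest", 99) < 23:
        points += 1
    return points

def risk_category(points: int) -> str:
    """Map the raw point total to a coarse category shown to a judge."""
    if points <= 1:
        return "low"
    if points <= 3:
        return "moderate"
    return "high"

example = {
    "prior_failure_to_appear": 1,
    "pending_charge_at_arrest": True,
    "prior_convictions": 1,
    "age_at_arrest": 25,
}
print(risk_category(failure_to_appear_score(example)))  # -> "moderate"
```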

Proponents of pretrial risk assessment argue that algorithms can be used to address mass incarceration, inefficiency, and inequity in the criminal justice system. The pilot program in Los Angeles, for example, was used to rapidly reduce the county’s incarcerated population in response to the COVID-19 pandemic. The New York City Criminal Justice Agency said its release assessment could help alleviate the city’s backlog of pending cases, according to recent Wall Street Journal coverage, and the National Partnership for Pretrial Justice similarly hopes to use risk scores to support fairness in judicial decision making.

More generally, according to a 2020 Brennan Center report, over 70% of the American jail population (about 536,000 people) are pretrial detainees, and many of these unconvicted individuals are detained while awaiting trial only because they can’t afford bail. Basing pretrial detention decisions on data-driven risk assessments rather than on ability to pay would end this system of wealth-based discrimination, according to proponents of California’s Proposition 25, who hoped to implement pretrial assessment systems and eliminate money bail throughout the state.

Others, however, argue that pretrial risk assessments do not help judges make more accurate, unbiased decisions. Opponents of such systems include not only those who oppose eliminating money bail (such as the bail bond industry and some law enforcement agencies) but also many civil rights organizations that advocate for criminal justice reform. In 2018, a coalition of over 100 civil rights, digital justice, and community-based organizations published a statement of concerns about embedding algorithmic decision making in the criminal justice system.

Many academics echo this skepticism. In 2019, 27 prominent researchers signed an open statement voicing concerns over “serious technical flaws” that undermine the accuracy, validity, and effectiveness of actuarial pretrial risk assessments. Like many civil rights advocates, they argued that such systems cannot adequately measure the risks judges are asked to weigh and that, instead, computer-based risk evaluations ultimately perpetuate historical racial inequities in the criminal justice system.
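
One way to see this concern in concrete terms, under purely hypothetical assumptions: if an input such as prior arrests is recorded at different rates for two groups with the same underlying behavior (for instance, because one group is policed more heavily), any score built on that input will differ across the groups even when the underlying risk does not. The simulation below uses invented numbers solely to illustrate the mechanism; it does not model any real jurisdiction or tool.

```python
# Invented illustration of how uneven policing can skew a score:
# two groups with identical underlying behavior, but one group's
# offenses are recorded (arrested) at twice the rate of the other's.

import random

random.seed(0)

def average_score(arrest_rate: float, n: int = 10_000) -> float:
    """Average score when everyone has the same true offense rate,
    but offenses become 'prior arrests' at the given recording rate."""
    total = 0
    for _ in range(n):
        true_offense = random.random() < 0.2                      # same for both groups
        recorded = true_offense and random.random() < arrest_rate  # biased recording
        total += 2 if recorded else 0                              # 2 points per prior arrest
    return total / n

print(average_score(arrest_rate=0.8))  # more heavily policed group -> higher average score
print(average_score(arrest_rate=0.4))  # same behavior, lower recording rate -> lower score
```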

Some government agencies and risk assessment developers have made efforts to bring transparency to these pretrial systems, so researchers and journalists should first search for readily available information before filing Freedom of Information Act requests. Legislation that would implement more of these algorithms is also worth watching. California’s Proposition 25, for example, would have required every county in the state to adopt pretrial assessment systems, each of which would have been important to examine in detail. And computer-based risk assessments are used in other areas of the criminal justice system as well, including recidivism reduction algorithms deployed at the federal level.
