Rene Anzaldua

Florida Legislature Should Reject New Bail Reform Risk Assessment Algorithm Proposals (SB-484)

With a recent amendment to Senate Bill 484, the conversation about risk assessments and bail reform has come directly to Florida, and regardless of how one feels about bail reform in general, the Legislature should be aware of the many pitfalls of going in this direction.

Using artificial intelligence in the criminal justice system has been all the rage for the last couple of years. As one expert has noted, we can no longer prepare for the arrival of the criminal algorithm age; we are living in it right now. While bail reform conversations are happening nationally, and a litany of issues might count as bail reform, the central component of the conversation is the use of criminal algorithms that purport to predict the risk posed by defendants so that judges can decide who gets out of jail and who stays in jail pending trial.

First, in a first-of-its-kind study of risk assessments in practice issued in December 2017, Professor Megan Stevenson of George Mason University School of Law found that the results were dubious. She began by noting that "virtually nothing is known about how the implementation of risk assessment affects key outcomes: incarceration rates, crime, misconduct, or racial disparities." After looking at several years' worth of data from Kentucky and other jurisdictions, she concluded that risk assessments led to only a "trivial decrease in the incarcerated population," that they "had no effect on racial disparities in pretrial detention," and that "failures-to-appear and pretrial crime increased as well."

Other recent studies call into question whether the expense and time of implementing risk assessments are worth it, given that the tools are not particularly predictive. One recent study by researchers at Dartmouth College found that untrained people online can predict new crimes as accurately as a widely used risk assessment algorithm deployed in the criminal justice system. "There was essentially no difference between people responding to an online survey for a buck and this commercial software being used in the courts," the researchers said. "If this software is only as accurate as untrained people responding to an online survey, I think the courts should consider that when trying to decide how much weight to put on them in making decisions." Of course, the Florida Legislature might want to think twice about encouraging these wasteful practices as well.

Aside from the fact that risk assessments do not work, there has also been a litany of problems with their transparency, including how they were built and what information they are based upon. A recent story on wired.com noted a "widespread transparency problem with state and municipal use of predictive algorithms." In many cases, the underlying data and analytics used to build the algorithms are secret and beyond public view. Yet the legislation currently being considered would encourage local jurisdictions in Florida to sign contracts with providers of proprietary algorithms, such as the Laura and John Arnold Foundation Public Safety Assessment and others, which are protected as trade secrets and intentionally concealed from public view. For example, two law professors found that "the Arnold Foundation has not revealed how it generated the algorithms, or whether it performed pre- or post-implementation validation tests, and if so, what the outcomes were."

Other researchers from across the nation have called for greater transparency and for the elimination of black-box and secret algorithms. For example, researchers at New York University concluded as follows:

Core public agencies, such as those responsible for criminal justice, healthcare, welfare, and education (e.g., "high stakes" domains) should no longer use "black box" AI and algorithmic systems. This includes the unreviewed or unvalidated use of pre-trained models, AI systems licensed from third party vendors, and algorithmic processes created in-house. The use of such systems by public agencies raises serious due process concerns, and at a minimum they should be available for public auditing, testing, and review, and subject to accountability standards.

In Florida, however, Senate Bill 484 as amended actually encourages this practice statewide rather than prohibiting it, and it does very little to impose accountability standards beyond requiring the government to validate the software it intends to use.

The Florida Legislature should pause and investigate the claims of the proprietors of these risk assessment software packages before it is too late. We believe the Legislature will find that these tools are a waste of money, that they create inefficiency and waste in the judicial process, and that, as in the case of Kentucky, they can have the direct opposite of the intended effect.

Jeff Clayton, Executive Director, American Bail Coalition

 
