West Virginia Bail Reform Legislation Would Encourage the Use of Unproven Criminal Risk Algorithms and Delegate Power to the Courts to Adopt the No-Money Bail System.
by Jeff Clayton, Executive Director, American Bail Coalition
Conversations on bail reform are all the rage.
The prevailing belief is that we can simply use computers to safely and efficiently reduce mass incarceration. Yet fundamental cracks are beginning to form in the foundation of that idea, namely the assumption that computer software can accurately predict who will commit a new crime or fail to appear in court without trampling on the rights of criminal defendants.
Of course, adopting the get-out-of-jail-free bail reform plan was indeed the trend last year, when New Jersey and New Mexico adopted no-money bail systems that have left those communities powerless to stop a wave of recidivist defendants using weakened bail systems to dodge accountability and re-victimize the community. This year, eleven State Attorneys General and the U.S. Justice Department are standing in the way of that trend.
Assembly Bill 4173 would computerize the bail system in West Virginia, tying the hands of judges in favor of new bail computers.
First, in a first-of-its-kind study of risk assessments in practice, issued in December 2017, Professor Megan Stevenson of George Mason University School of Law found that the results were dubious. She began by noting that “virtually nothing is known about how the implementation of risk assessment affects key outcomes: incarceration rates, crime, misconduct, or racial disparities.” After looking at several years’ worth of data from Kentucky and other jurisdictions, she concluded that risk assessments lead to only a “trivial decrease in the incarcerated population,” that they “had no effect on racial disparities in pretrial detention,” and that “failures-to-appear and pretrial crime increased as well.”
Other recent studies call into question whether the expense and time of implementing risk assessments are worth it, given that the tools are not particularly predictive. One recent study by researchers at Dartmouth College found that untrained people online can predict new crimes as well as a widely used risk assessment algorithm in the criminal justice system. “There was essentially no difference between people responding to an online survey for a buck and this commercial software being used in the courts,” said the researchers. “If this software is only as accurate as untrained people responding to an online survey, I think the courts should consider that when trying to decide how much weight to put on them in making decisions.” Of course, the West Virginia Legislature might want to think twice about encouraging these wasteful practices as well.
Aside from the fact that risk assessments don’t work, there has also been a litany of problems with their transparency, including how they were built and what information they are based on. A recent story on wired.com noted that there is a “widespread transparency problem with state and municipal use of predictive algorithms.”
In many cases, the underlying data and analytics used to build the algorithms are secret and beyond public view. For example, two law professors found that “the Arnold Foundation has not revealed how it generated the algorithms, or whether it performed pre- or post-implementation validation tests, and if so, what the outcomes were.”
Other researchers from across the nation have called for greater transparency and elimination of black-box and secret algorithms. For example, researchers at New York University concluded as follows:
Core public agencies, such as those responsible for criminal justice, healthcare, welfare, and education (e.g., “high stakes” domains) should no longer use “black box” AI and algorithmic systems. This includes the unreviewed or unvalidated use of pre-trained models, AI systems licensed from third party vendors, and algorithmic processes created in-house. The use of such systems by public agencies raises serious due process concerns, and at a minimum they should be available for public auditing, testing, and review, and subject to accountability standards.
In West Virginia, Assembly Bill 4173 would actually encourage this practice statewide rather than prohibit it, and it would do very little to impose accountability standards like those suggested.
Assembly Bill 4160 would give the courts a blank check to enact a rule to implement a no-money bail system.
States trying the no-money bail system are not doing well with it. The New Mexico Governor is calling for repeal of the system. Law enforcement, mayors, legislators, and other public officials in New Jersey are calling for reform of their no-money bail system, where recidivism is running rampant because the only choice for judges is detention or release. In Harris County, Texas, the direct implementation of this free-bail equal-protection concept invented by Eric Holder has resulted in an explosion of failures to appear in court and an uptick in crime committed while on bail. The latest statistics showed that 45% of all misdemeanor defendants released on free bail failed to appear in court, three times the rate for cash bonds and six times the rate for surety bonds. Not to mention that California, Connecticut, New York, and Texas directly rejected similar no-money bail plans last year.
There are always ways to improve the criminal justice system. Trying to fix the bail system with unproven computer software that may make the problem worse is not sound public policy. Nor is steaming ahead with a plan to delegate power to the Supreme Court to adopt a no-money bail system.