Ohio, long championing itself as a leader in the use of validated risk assessment tools, is in fact open to serious scrutiny of its risk model.
August 14, 2018
There is continuing opposition to the use of risk assessments in what might be called the cradle-to-grave of the criminal justice system.
Currently, there is an explosion in the use of technological tools that look more like social control than solutions: tools that tell police where to make arrests, sentencing algorithms, and systems that decide whether someone gets probation or parole.
Ohio has long championed itself as a leader on these issues, and with Cuyahoga County on the verge of adopting similar tools for pretrial release, we thought we would do a little digging to see how well Ohio measures up.
Ohio uses ten risk assessment tools, according to the Ohio Department of Rehabilitation and Correction. As we began our investigation, we decided to score the ten tools on the following five factors: (1) is it currently valid according to national best practices; (2) when was it last validated; (3) was it ever validated; (4) was it tested for racial bias; and (5) has it been shown to be racially neutral according to national best practices.
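The five-factor checklist above can be sketched as a simple scoring function. This is only an illustration of our rubric; the tool names and answers below are hypothetical, not our actual findings.

```python
# A minimal sketch of the five-factor checklist applied to each tool.
# A factor counts toward the score if it has a truthy answer
# (a validation date counts as satisfying "when was it last validated").

FACTORS = [
    "currently_valid_per_best_practices",
    "date_last_validated",
    "ever_validated",
    "tested_for_racial_bias",
    "shown_racially_neutral",
]

def score_tool(answers: dict) -> int:
    """Count how many of the five factors a tool satisfies."""
    return sum(1 for f in FACTORS if answers.get(f) not in (None, False))

# Hypothetical example, not a real scoring from our table:
example = {
    "currently_valid_per_best_practices": False,
    "date_last_validated": 2009,
    "ever_validated": True,
    "tested_for_racial_bias": False,
    "shown_racially_neutral": False,
}
print(score_tool(example))  # prints 2: only the validation date and "ever validated" count
```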
The results are in the table below; here is how we obtained them.
During our research, we obtained the validation report for ORAS (Ohio Risk Assessment System, which is the framework for the 10 tools).
In reviewing that report, we noted that it validated only five of the ten tools. By contacting the Ohio Department of Rehabilitation and Correction, we discovered that the 2009 validation report is the only validation report that exists for the ORAS suite of ten tools. If you don’t believe us, call over and ask for all of the validation reports for any or all of the ORAS tools. This is a huge problem, because the 2009 validation report itself notes that national best practices require revalidation every 18-24 months, something Ohio has now not done for nearly a decade. As a result, invalid tools are being used in Ohio every day.
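The staleness arithmetic is simple to check. Assuming the 2009 report dates from early that year (the exact month is illustrative), the elapsed time far exceeds even the outer bound of the 18-24 month revalidation window the report itself cites:

```python
from datetime import date

# Outer bound of the 18-24 month revalidation best practice
# cited in the 2009 ORAS validation report.
REVALIDATION_WINDOW_MONTHS = 24

def months_since(validated: date, today: date) -> int:
    """Whole months elapsed between two dates."""
    return (today.year - validated.year) * 12 + (today.month - validated.month)

last_validated = date(2009, 1, 1)  # assumed month; only the year is documented
today = date(2018, 8, 14)          # date of this post

elapsed = months_since(last_validated, today)
print(elapsed, elapsed > REVALIDATION_WINDOW_MONTHS)  # prints: 115 True
```

Even if the report were dated late 2009, the tools would be nearly four revalidation cycles overdue.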
In reviewing the 2009 report, we found that, aside from the severe data limitations it notes, the study never asked whether the tool predicted equally across racial groups, using any of the available approaches for testing racial bias. In fact, the report does not consider racial bias in any of the tools at all; it expresses concern only for under- or over-representation in the sample.
Indeed, this is the state of affairs in algorithmic justice in America. Ten “tools,” developed by a number of government agencies and their contractors: five were never validated in the first place, five were validated in 2009, and all ten are currently not valid.
Sadly, the entire system could be racially biased, and because Ohio never asked the question, we just don’t know.
It’s time to end secret profiling and big data in criminal justice.