EJI: Risk Assessments Biased Against African Americans, Study Finds

A report from the Equal Justice Initiative highlights a ProPublica study of the computer algorithms used in pretrial release decisions. The study, which examined algorithms that rate a defendant's risk of future crime, found that they falsely labeled black defendants as future criminals at nearly twice the rate of white defendants.

At the same time, white defendants were wrongly identified as low risk more often than black defendants.
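
These two findings correspond to the standard error rates of a binary classifier: a false positive rate (defendants who did not reoffend but were labeled high risk) and a false negative rate (defendants who did reoffend but were labeled low risk), computed separately for each racial group. The sketch below, in Python, only illustrates how such per-group rates are calculated; the records, field names, and function are hypothetical, not the study's data or code.

```python
# Minimal sketch (hypothetical data): per-group error rates of a
# binary risk classifier. "predicted_high_risk" is the tool's label;
# "reoffended" is the two-year outcome used as ground truth.

records = [
    # (group, predicted_high_risk, reoffended) -- illustrative only
    ("black", True, False),
    ("black", True, True),
    ("black", False, True),
    ("white", True, True),
    ("white", False, False),
    ("white", False, True),
]

def error_rates(records, group):
    rows = [(p, y) for g, p, y in records if g == group]
    # False positive rate: share labeled high risk among those who did NOT reoffend
    negatives = [p for p, y in rows if not y]
    fpr = sum(negatives) / len(negatives) if negatives else float("nan")
    # False negative rate: share labeled low risk among those who DID reoffend
    positives = [not p for p, y in rows if y]
    fnr = sum(positives) / len(positives) if positives else float("nan")
    return fpr, fnr

for group in ("black", "white"):
    fpr, fnr = error_rates(records, group)
    print(f"{group}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
```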

Judges, prosecutors, prison officials, and parole officers are increasingly using risk assessments to make decisions at every stage of the criminal justice process, from pretrial release to sentencing to parole. Dozens of different risk assessment tools are in use nationwide, but few independent studies have evaluated their accuracy or investigated whether they are racially biased.

ProPublica examined risk scores assigned to more than 7,000 people arrested in Broward County, Florida, in 2013 and 2014. The scores were produced by a for-profit company, Northpointe, whose software is one of the most widely used risk assessment tools in the country.

Looking at how many defendants were charged with new crimes over the next two years, which is the benchmark used by Northpointe, the study found that the risk scores were “remarkably unreliable in forecasting violent crime.” Only 20 percent of those predicted to commit a violent crime actually were charged with a subsequent violent offense.
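
The 20 percent figure is what statisticians call positive predictive value: of everyone the tool flagged as a likely violent reoffender, the share who were actually charged with a subsequent violent crime. A minimal Python sketch follows, with hypothetical counts chosen only to reproduce that figure; they are not taken from the ProPublica data.

```python
# Minimal sketch (hypothetical counts): the study's 20 percent figure
# is a positive predictive value. These counts are illustrative.
flagged_as_violent = 1000      # defendants predicted to commit a violent crime
charged_with_violence = 200    # of those, how many were later charged with one
ppv = charged_with_violence / flagged_as_violent
print(f"positive predictive value: {ppv:.0%}")  # -> 20%
```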
