Citizens Against Computerized Justice

End Secret Profiling in Criminal Justice

There is a huge push to use algorithms to solve our criminal justice problems.  Proponents and proprietors of such risk assessment tools believe they can use computers to replace common sense and judicial discretion, all while preserving abstract but important concepts such as justice.

Indeed, it is believed that we can tell the police where to patrol and whom to arrest; decide who gets pretrial release and who does not; and then decide who gets what sentence and who is offered a second chance - all by algorithm.

Yet there are serious cracks in the computerized justice system, and it is time for states and jurisdictions using such risky algorithms to take a hard look at whether these algorithms work, what impact they have on bias in the system, and what negative effects they may be having on outcomes.

One study found that predictive policing algorithms, including one used in California, “have sometimes seemed to recreate exactly the kind of racial biases their creators say they overcome.”

Biased policing is made worse by errors in pre-crime algorithms
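The feedback loop critics describe can be sketched in a few lines.  This is a deliberately simplified toy model (the neighborhood labels, rates, and arrest counts are invented for illustration), not any vendor's actual software: two neighborhoods have the same true crime rate, but one starts with more recorded arrests, and a patrol-allocation rule trained on arrest records keeps sending officers back to it.

```python
# Toy simulation of a predictive-policing feedback loop (illustrative only;
# all numbers and names are hypothetical, not any real system's data).
# Two neighborhoods, A and B, have the SAME true crime rate, but A starts
# with more recorded arrests (historically biased data).

true_crime_rate = {"A": 0.10, "B": 0.10}   # identical underlying rates
recorded = {"A": 60, "B": 40}              # biased historical arrest records

for _ in range(10):
    # The "algorithm": patrol wherever past data shows the most arrests.
    target = max(recorded, key=recorded.get)
    # Patrols generate new arrest records only where officers are sent,
    # so crime in the unpatrolled neighborhood goes unrecorded.
    recorded[target] += int(100 * true_crime_rate[target])

print(recorded)  # the initial disparity has widened despite equal true rates
```

Because the rule only ever "confirms" the neighborhood it already targets, the recorded gap grows every round - the data-driven system reproduces and amplifies the bias it was fed.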

When it comes to getting out on bail pending trial, states and jurisdictions are moving to require an algorithm to evaluate criminal defendants to decide whether they get bail and, if they do, whether they’ll have their blood chemistry monitored, wear a GPS unit, or be subject to house arrest.  Yet there has been widespread criticism that such algorithms are inherently biased.  In addition, algorithms that lack full transparency and are protected as trade secrets are being used in jurisdictions all over the country, preventing judges, defense lawyers, and prosecutors from seeing behind the curtain.  And in one landmark study of such algorithms, they didn’t decrease mass incarceration at all, producing only a “trivial” decrease in the jail population.

In sentencing, algorithms have been shown to be biased against African Americans.  In many states, these algorithms define who is suitable to remain in the community, so their results can directly influence whether a person gets prison or probation.

Risk Assessments Biased Against African Americans, Study Finds

These dangerous risk assessments must finally be scrutinized, and their use and scope limited, before it's too late.  Our mission is to do just that...

Learn More on the Issues...

Computerized Justice is No Justice at all...

End secret profiling in policing, sentencing, pretrial release, parole, and child welfare decisions...TAKE ACTION with our national campaign to educate legislators and local officials.