AI EXPERTS WANT TO END ‘BLACK BOX’ ALGORITHMS IN GOVERNMENT

Computerized Justice is No Justice

According to an article in Wired, leading researchers in artificial intelligence are warning of the perils of relying on algorithms and artificial intelligence in government…and we couldn’t agree more.

AI EXPERTS WANT TO END ‘BLACK BOX’ ALGORITHMS IN GOVERNMENT

(excerpt from Wired.com – Oct 18 2017)

The right to due process was inscribed into the US Constitution with a pen. A new report from leading researchers in artificial intelligence cautions that it is now being undermined by computer code.

Public agencies responsible for areas such as criminal justice, health, and welfare increasingly use scoring systems and software to steer or make decisions on life-changing events like granting bail, sentencing, enforcement, and prioritizing services. The report from AI Now, a research institute at NYU that studies the social implications of artificial intelligence, says too many of those systems are opaque to the citizens they hold power over.

The AI Now report calls for agencies to refrain from what it calls “black box” systems opaque to outside scrutiny. Kate Crawford, a researcher at Microsoft and cofounder of AI Now, says citizens should be able to know how systems making decisions about them operate and have been tested or validated. Such systems are expected to get more complex as technologies such as machine learning used by tech companies become more widely available.

“We should have equivalent due-process protections for algorithmic decisions as for human decisions,” Crawford says. She says it can be possible to disclose information about systems and their performance without disclosing their code, which is sometimes protected intellectual property.

read more…


 

One Response

  1. compjustice says:

    Risk algorithms threaten due process and a number of related civil liberties – we cannot trust what is not transparent.
