Civil Rights, Tech Groups Sound the Alarm About ‘Predictive Policing’

By David McCabe


Major civil rights and technology organizations on Wednesday said they are worried about emerging technology that identifies potential perpetrators of crimes, fearing it could fuel racial bias in law enforcement.

Predictive policing technology is designed to highlight areas where crime may occur. The algorithms also attempt to spot people who are more likely to commit crimes.
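The place-based side of this can be sketched with a deliberately simplified toy model (the function, area names and incident log below are hypothetical illustrations, not any vendor's actual algorithm; real systems use far richer features):

```python
from collections import Counter

def rank_hotspots(incident_log, top_n=2):
    """Rank areas by historical incident count -- a toy stand-in for
    real predictive-policing models, which are far more sophisticated."""
    counts = Counter(incident_log)
    return [area for area, _ in counts.most_common(top_n)]

# Hypothetical log of recorded incidents by area. If Area A was patrolled
# (and therefore recorded) more heavily in the past, it dominates the
# "prediction" regardless of the true underlying crime rate -- the
# feedback loop that critics of these systems describe.
log = ["A", "A", "A", "B", "A", "C", "B"]
print(rank_hotspots(log))  # ['A', 'B']
```

The sketch shows why the input data matters so much: the model's output simply mirrors whatever enforcement patterns produced the records it was trained on.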

Civil rights groups said dangers lurk in the use of historical crime data, which can reflect biased patterns of past enforcement.

“Automated predictions based on such biased data — although they may seem objective or neutral — will further intensify unwarranted discrepancies in enforcement,” the groups said.

They also argued that the veneer of dispassionate technology could hide the threat of discrimination at law enforcement agencies.

“Predictive policing tools threaten to provide a misleading and undeserved imprimatur of impartiality for an institution that desperately needs fundamental change,” they said.

The groups questioned whether predictions from the systems might lead to unconstitutional searches and expressed broader concerns about the lack of transparency around how the tools work. They also asked why police departments were not using predictive technology to spot patterns of troubling behavior among their own officers.

The message was led by the Leadership Conference on Civil & Human Rights and signed by organizations including the NAACP, the American Civil Liberties Union and the Electronic Frontier Foundation. It accompanies a report from the tech policy consultancy Upturn on predictive policing practices.

The pushback against predictive policing comes at a time when many have begun to consider the ways that algorithms may yield biased results against certain groups. The Leadership Conference previously released principles for using big data without violating civil rights.

The debate over biases in algorithms also ties into a broader conversation about diversity in tech. Many advocates argue that bias embedded in computer systems is closely tied to tech-company workforces that are overwhelmingly white, male and Asian.
