This algorithm assigns scores to people based on their criminal records (i.e., arrests or shootings) as well as any known gang affiliations and other variables, according to the New York Times. Illinois Institute of Technology professor Dr. Miles Wernick created the algorithm. He told the NYT that while the system does look at a person's past criminal activities, it specifically excludes biasing variables like race, gender, ethnicity and location.
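To make that concrete, a model of this kind can be thought of as a weighted sum over record-based variables, with demographic inputs deliberately left out. The sketch below is purely illustrative, written under stated assumptions; the variable names and weights are hypothetical and do not reflect the actual CPD/Wernick model, which has not been published.

```python
# Hypothetical sketch of a record-based risk score.
# All variables and weights here are illustrative assumptions,
# NOT the actual model described in the article.

def risk_score(record):
    """Return a score for a person's record; higher means higher predicted risk."""
    weights = {
        "arrests": 1.0,             # number of prior arrests
        "shooting_incidents": 3.0,  # prior involvement in shootings
        "gang_affiliation": 2.0,    # known gang ties (0 or 1)
    }
    # Note what is absent: race, gender, ethnicity and location are
    # simply not inputs, mirroring the exclusions Wernick describes.
    return sum(weights[k] * record.get(k, 0) for k in weights)

print(risk_score({"arrests": 4, "shooting_incidents": 1, "gang_affiliation": 1}))  # 9.0
```

Even in a toy model like this, the critics' point below still applies: the choice of variables and weights is opaque to the people being scored.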
With it, the CPD curated a 1,400-member "Strategic Subject List" that has already proven to be uncannily accurate. In 2016, over 70 percent of the people shot in the Second City were on the list, as were 80 percent of the shooters. According to the CPD, 117 of the 140 people arrested during a city-wide gang raid performed last week were on the list as well.
The police aren't using this list simply to target individuals for arrest; the city also uses it to perform "custom notifications," in which social workers and community leaders meet with people who score high on the list and attempt to intervene, offering them a way out of gang life.
"The model just makes suggestions," Jonathan H. Lewin, deputy chief of the Chicago Police Department's technology and records group, told the NYT. "This is not designed to replace the human process. This is just designed to inform it."
However well-intentioned the police's actions, the program has been met with suspicion from the community. "We're concerned about this," Karen Sheley, the director of the Police Practices Project of the American Civil Liberties Union of Illinois, told the NYT. "There's a database of citizens built on unknown factors, and there's no way for people to challenge being on the list. How do you get on the list in the first place? We think it's dangerous to single out somebody based on secret police information."
Sheley's fears are not unfounded. New York in the 1990s, for example, instituted a similar system dubbed CompStat. Its reputation as a crime-fighting tool has been marred, however, by persistent rumors that NYPD brass manipulates the data and intentionally downgrades reports of violent crimes to lesser offenses. And even when these crime-predicting computers aren't being intentionally manipulated, there's the issue of inherent, if unintentional, bias present in their programming. Whether or not these algorithms will actually help reduce crime remains to be seen, but our Minority Report future seems certain.