Image credit: KENGKAT via Getty Images

Facial recognition linked to a second wrongful arrest by Detroit police

It's the second known such case in the US to date.
Kris Holt, @krisholt
July 10, 2020

A false facial recognition match has led to the arrest of another innocent person. According to the Detroit Free Press, police in the city arrested a man for allegedly reaching into a person’s car, taking their phone and throwing it, breaking the case and damaging the screen in the process.

Facial recognition flagged Michael Oliver as a possible suspect, and the victim identified him in a photo lineup as the person who damaged their phone. Oliver was charged with a felony count of larceny over the May 2019 incident. He said he didn’t commit the crime and the evidence supported his claim.

The perpetrator, who was recorded in footage captured on a phone, doesn’t look like Oliver. For one thing, he has tattoos on his arms, and there aren’t any visible on the person in the video. When Oliver’s attorney took photos of him to the victim and an assistant prosecutor, they agreed Oliver had been misidentified. A judge later dismissed the case.

Facial recognition tech was used in Oliver’s case before new rules came into force. Detroit police can now only use it to investigate violent felonies. Wayne County’s top prosecutor will also review all facial recognition cases in which an assistant prosecuting attorney and a supervisor agree that charges should be brought.

This is not the only time officers in the city have wrongfully arrested someone following a false match. In a high-profile case earlier this year, they arrested Robert Williams and detained him for almost 30 hours over a crime he didn’t commit. These are the first two known cases of wrongful arrests stemming from false facial recognition matches.

Late last month, Detroit Police Chief James Craig suggested the technology the department uses, which was created by DataWorks Plus, isn’t always reliable. “If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify,” he said in a public meeting, according to Motherboard. From the start of the year through June 22nd, the force used the software 70 times, per the department’s public data. In all but two of those cases, the person whose image the technology analyzed was Black.

Oliver and Williams are both Black men. Various studies have suggested there are elements of racial bias in facial recognition tech.

“Detroit police’s new policy is a fig leaf that provides little to no protection against a dangerous technology subjecting an untold number of people to the disasters that Robert Williams and Michael Oliver have already experienced,” American Civil Liberties Union of Michigan legal director Dan Korobkin told Engadget in a statement.

“Lawmakers must take urgent action to stop law enforcement use of this technology until it can be determined what policy, if any, can effectively prevent this technology's harms. At the same time, police and prosecutors nationwide should review all cases involving the use of this technology and should notify all individuals charged as a result of it. This technology is dangerous when wrong and dangerous when right.”

There have been calls at varying levels of government to ban police use of facial recognition tech, including from Black Democrats in the Michigan House of Representatives. Several cities, including Boston and San Francisco, have banned or limited the use of facial recognition. Members of Congress filed a bill last month that seeks to “prohibit biometric surveillance by the federal government without explicit statutory authorization.”

Some tech companies that have worked on facial recognition have reassessed their positions on the tech. IBM says it’ll no longer develop “general purpose” facial recognition due to human rights concerns. Amazon has paused police use of Rekognition, while Microsoft won’t sell its facial recognition tech to police departments until there are federal rules “grounded in human rights.”
