A false facial recognition match has led to the arrest of another innocent person. According to the Detroit Free Press, police in the city arrested a man for allegedly reaching into a person’s car, taking their phone and throwing it, breaking the case and damaging the screen in the process.
Facial recognition flagged Michael Oliver as a possible suspect, and the victim identified him in a photo lineup as the person who damaged their phone. Oliver was charged with a felony count of larceny over the May 2019 incident. He said he didn’t commit the crime, and the evidence supported his claim.
The perpetrator, who was recorded in footage captured on a phone, doesn’t look like Oliver. For one thing, Oliver has tattoos on his arms, and none are visible on the person in the video. When Oliver’s attorney took photos of him to the victim and an assistant prosecutor, they agreed Oliver had been misidentified. A judge later dismissed the case.
Facial recognition tech was used in Oliver’s case before new rules came into force. Detroit police can now only use it to investigate violent felonies. Wayne County’s top prosecutor will also review all facial recognition cases in which an assistant prosecuting attorney and a supervisor agree that charges should be brought.
This is not the only time officers in the city have wrongfully arrested someone following a false match. In a high-profile case earlier this year, they arrested and detained Robert Williams for almost 30 hours for a crime he didn’t commit. These are the first two known cases of wrongful arrests to stem from false facial recognition matches.
Late last month, Detroit Police Chief James Craig suggested the technology the department uses, which was created by DataWorks Plus, isn’t always reliable. “If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify,” he said in a public meeting, according to Motherboard. From the start of the year through June 22nd, the force used the software 70 times, per the department’s public data. In all but two of those cases, the person whose image the technology analyzed was Black.
“Detroit police’s new policy is a fig leaf that provides little to no protection against a dangerous technology subjecting an untold number of people to the disasters that Robert Williams and Michael Oliver have already experienced,” American Civil Liberties Union of Michigan legal director Dan Korobkin told Engadget in a statement.
“Lawmakers must take urgent action to stop law enforcement use of this technology until it can be determined what policy, if any, can effectively prevent this technology's harms. At the same time, police and prosecutors nationwide should review all cases involving the use of this technology and should notify all individuals charged as a result of it. This technology is dangerous when wrong and dangerous when right.”
There have been calls at varying levels of government to ban police use of facial recognition tech, including from Black Democrats in the Michigan House of Representatives. Several cities, including Boston and San Francisco, have banned or limited the use of facial recognition. Members of Congress filed a bill last month that seeks to “prohibit biometric surveillance by the federal government without explicit statutory authorization.”
Some tech companies that have worked on facial recognition have reassessed their positions on the tech. IBM says it’ll no longer develop “general purpose” facial recognition due to human rights concerns. Amazon has paused police use of Rekognition, while Microsoft won’t sell its facial recognition tech to police departments until there are federal rules “grounded in human rights.”