Image credit: izusek via Getty Images

UK police's facial recognition system has an 81 percent error rate

But officials inside the Metropolitan Police say otherwise.

Facial recognition technology is flagging innocent people as wanted suspects in four out of five cases, according to findings from the University of Essex. The report -- which was commissioned by Scotland Yard -- found that the technology used by the UK's Metropolitan Police is 81 percent inaccurate and concludes that it is "highly possible" the system would be found unlawful if challenged in court.

The report, obtained by Sky News, is the first independent evaluation of the scheme since the technology was first used at Notting Hill Carnival in August 2016. Since then it has been deployed at 10 locations, including Leicester Square and during Remembrance Sunday services. Researchers measured the accuracy of the technology at six of these locations and found that of 42 "suspect matches," only eight were correct, giving an error rate of 81 percent.

However, the Met measures accuracy in a different way, comparing successful and unsuccessful matches against the total number of faces processed by the system. Measured this way, the error rate is just 0.1 percent. In response to the report, the Met's deputy assistant commissioner, Duncan Ball, said the force was "extremely disappointed with the negative and unbalanced tone of this report." The authors of the report, meanwhile, said their findings raised "significant concerns." This is not the first time UK police have come under fire for such inaccuracies -- in 2018 South Wales Police misidentified 2,300 people as potential criminals.
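The gap between the two figures comes down to the choice of denominator, which a quick calculation makes concrete. The match counts (42 alerts, 8 correct) are taken from the report; the total number of faces scanned is a hypothetical figure for illustration only, since the article does not give one.

```python
# Two ways of measuring the same system's accuracy.
correct_matches = 8
total_matches = 42
false_matches = total_matches - correct_matches  # 34 incorrect alerts

# Researchers' metric: what share of the system's alerts were wrong?
error_rate_per_alert = false_matches / total_matches
print(f"Error rate per alert: {error_rate_per_alert:.0%}")  # → 81%

# Met's metric: false alerts as a share of ALL faces the system scanned.
faces_processed = 35_000  # hypothetical total, for illustration only
error_rate_per_face = false_matches / faces_processed
print(f"Error rate per face scanned: {error_rate_per_face:.2%}")  # → 0.10%
```

The researchers' figure describes the experience of a person stopped after an alert; the Met's figure describes the system's behaviour across everyone it scans. Both are arithmetically valid, which is why the same deployment can be reported as 81 percent wrong or 99.9 percent right.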

The use of facial recognition technology has skyrocketed in recent years, with systems being installed in public transport hubs and at large events. Despite some apparent "successes" -- such as the identification of a traveler using a fraudulent passport at Dulles airport just three days after the system was launched -- the technology continues to pose a number of ethical and legal dilemmas. In China, for example, facial recognition is being used to monitor ethnic minorities and track children's classroom behavior. Meanwhile, a number of tech giants have made their apprehensions about the technology clear. Microsoft has been outspoken about its desire for proper regulation, while both Apple and Google have expressed similar concerns. As this new report demonstrates, the technology still has a long way to go before it can be considered truly reliable.
