
Image credit: izusek via Getty Images

UK police's facial recognition system has an 81 percent error rate

But officials inside the Metropolitan Police say otherwise.

Facial recognition technology used by the UK's Metropolitan Police mistakenly flags innocent people as wanted suspects in four out of five cases, according to findings from the University of Essex. The report -- which was commissioned by Scotland Yard -- found that the technology is 81 percent inaccurate and concludes that it is "highly possible" the system would be found unlawful if challenged in court.

The report, obtained by Sky News, is the first independent evaluation of the scheme since the technology was first deployed at Notting Hill Carnival in August 2016. Since then it has been used at 10 locations, including Leicester Square and during Remembrance Sunday services. Researchers measured the technology's accuracy at six of these locations and found that of 42 "suspect matches," only eight were correct -- an error rate of 81 percent.

However, the Met measures accuracy differently, comparing successful and unsuccessful matches against the total number of faces processed by the system. Interpreted this way, the error rate is just 0.1 percent. In response to the report, the Met's deputy assistant commissioner, Duncan Ball, said the force was "extremely disappointed with the negative and unbalanced tone of this report." The report's authors, meanwhile, said their findings raised "significant concerns." This is not the first time UK police have come under fire for such inaccuracies -- in 2018 South Wales Police misidentified 2,300 people as potential criminals.
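The gulf between the two figures comes down to the choice of denominator. A minimal sketch of both calculations, using the match counts reported in the study (the total number of faces scanned is a hypothetical placeholder, since the article doesn't give the Met's exact figure):

```python
# Figures from the University of Essex report
suspect_matches = 42   # alerts raised by the system
correct_matches = 8    # alerts verified as correct
false_matches = suspect_matches - correct_matches  # 34

# Essex method: errors as a share of alerts raised
essex_error_rate = false_matches / suspect_matches
print(f"Essex error rate: {essex_error_rate:.0%}")  # -> 81%

# Met method: errors as a share of ALL faces processed.
# faces_processed is an assumed value, chosen only to show how a
# large denominator drives the rate down toward the 0.1% cited.
faces_processed = 34_000
met_error_rate = false_matches / faces_processed
print(f"Met error rate: {met_error_rate:.1%}")      # -> 0.1%
```

Both calculations use the same 34 false matches; only the base changes, which is why the two sides can quote such different numbers from the same trials.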

The use of facial recognition technology has skyrocketed in recent years, with systems installed in public transport hubs and at large events. Despite some apparent "successes" -- such as the identification of a traveler using a fraudulent passport at Dulles airport just three days after the system was launched -- the technology continues to pose a number of ethical and legal dilemmas. In China, for example, facial recognition is being used to monitor ethnic minorities and track children's classroom behavior. Meanwhile, a number of tech giants have made their apprehensions about the technology clear: Microsoft has been outspoken about its desire for proper regulation, while both Apple and Google have expressed similar concerns. As this new report demonstrates, the technology still has a long way to go before it can be considered truly reliable.

