Ask critics of police face recognition why they're so skeptical and they'll likely cite unreliability as one factor. What if the technology flags an innocent person? Unfortunately, that caution appears to have been warranted to some degree. South Wales Police are facing a backlash after they released data showing that their face recognition trial at the 2017 Champions League final misidentified thousands as potential criminals. Of the 2,470 initial matches, 2,297 were false positives -- roughly 93 percent.
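That percentage can be checked directly from the released figures:

```python
# False-positive rate from the figures South Wales Police released
# for the 2017 Champions League final trial.
initial_matches = 2470   # total alerts raised by the system
false_positives = 2297   # alerts later judged incorrect

rate = false_positives / initial_matches
print(f"False-positive rate: {rate:.1%}")  # → 93.0%
```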
The police unit pinned the results on both "poor quality images" supplied by Interpol and UEFA and the novelty of the technology. "No facial recognition system is 100% accurate under all conditions," the force wrote. It was also quick to defend the overall track record, noting that there had been 2,000 positive IDs in nine months, leading to 450 arrests and no one mistakenly taken into custody. Accuracy is believed to be improving, although false positives have continued to occur at events since the system's debut last June.
The absence of mistaken arrests supports the force's claim that extensive safeguards are in place. Officers still review the initial alerts to see if they're authentic, and an "intervention team" can approach the flagged individual to determine whether it's the right person. Even so, the gap between the number of potential matches at the Champions League final and the actual arrests is hard to ignore. Privacy issues notwithstanding, there's a chance those false positives could bog down officers who are looking for suspects in urgent situations.