Orlando won't use Amazon's facial recognition software anymore

The city never got to the point of testing any of the images.

Orlando has canceled Amazon's facial recognition pilot following a series of technical problems and other issues, according to Orlando Weekly. The city started testing the facial recognition software -- which, according to a study, shows gender and race bias and tends to misidentify dark-skinned women -- back in 2017. After the first trial period expired in mid-2018, local officials showed hesitation in renewing the partnership before deciding to go through with a second pilot. Looks like the second go was just as unsatisfactory, though, because a memo sent to the City Council reportedly said the pilot didn't make "noticeable progress" and that Orlando doesn't have immediate plans to launch more facial recognition trials.

Orlando's Chief Administrative Office wrote in the memo:

"At this time, the city was not able to dedicate the resources to the pilot to enable us to make any noticeable progress toward completing the needed configuration and testing. [The city has] no immediate plans regarding future pilots to explore this type of facial recognition technology."

Amazon outfitted several cameras in the city with its Rekognition software, which requires users to upload images of the people they want to keep an eye out for. The technology then alerts authorities if it spots a match within live surveillance streams. Local authorities enlisted a handful of police officers as test subjects and used eight cameras for the test: four at the police department's headquarters, three downtown and one outside a community recreation center.
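For context on how that workflow maps onto Amazon's API: Rekognition exposes calls to index reference images into a face collection and to attach a "stream processor" that searches live video for matches. The sketch below is a hypothetical illustration using boto3, not Orlando's actual configuration; the collection name, file path, ARNs and IAM role are all placeholders.

```python
def face_search_settings(collection_id: str, threshold: float = 80.0) -> dict:
    # Settings payload for Rekognition's create_stream_processor call:
    # search faces in live video against a previously indexed collection.
    return {"FaceSearch": {"CollectionId": collection_id,
                           "FaceMatchThreshold": threshold}}


def run_pilot_sketch() -> None:
    # Requires AWS credentials and the boto3 SDK; imported here so the
    # helper above stays importable without it.
    import boto3
    rek = boto3.client("rekognition")

    # Step 1: upload reference images of the people to watch for.
    rek.create_collection(CollectionId="persons-of-interest")  # placeholder name
    with open("subject.jpg", "rb") as f:  # placeholder image path
        rek.index_faces(CollectionId="persons-of-interest",
                        Image={"Bytes": f.read()},
                        ExternalImageId="subject-1")

    # Step 2: attach a stream processor to a live Kinesis video stream;
    # match events are written to a Kinesis data stream for alerting.
    rek.create_stream_processor(
        Name="live-face-search",
        Input={"KinesisVideoStream": {"Arn": "arn:aws:kinesisvideo:..."}},  # placeholder
        Output={"KinesisDataStream": {"Arn": "arn:aws:kinesis:..."}},       # placeholder
        RoleArn="arn:aws:iam::123456789012:role/RekognitionAccess",         # placeholder
        Settings=face_search_settings("persons-of-interest"),
    )
    rek.start_stream_processor(Name="live-face-search")
```

The bandwidth problems described in this article would surface in step 2: each stream processor consumes a continuous Kinesis video feed, which is what Orlando's network reportedly couldn't sustain for more than one camera at a time.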

The pilot ultimately failed due to several factors, including bandwidth issues that prevented the staff from running the powerful software alongside more than one surveillance camera at a time. Even when the staff could get the video feeds to work, they tended to disconnect. Rosa Akhtarkhavari, who headed the pilot, said they've "never gotten to the point to test images."

Orlando's cameras also lacked the video resolution the software needed to recognize the test subjects. Apparently, Amazon offered to supply cameras for the pilot, but Orlando declined. The cameras were also placed too high and often captured only the tops of people's heads.

While city officials said they have no immediate plans to test facial recognition again, they also noted in the memo that they would continue exploring other advances in technology. By doing so, they're hoping to "further [support the] city's mission to become America's premier Future-Ready City."

Update: Matt Cagle, Technology and Civil Liberties attorney with the ACLU of Northern California, told us in a statement:

"Congratulations to the Orlando Police Department for finally figuring out what we long warned -- Amazon's surveillance technology doesn't work and is a threat to our privacy and civil liberties. This failed pilot program demonstrates precisely why surveillance decisions should be made by the public through their elected leaders, and not by corporations secretly lobbying police officials to deploy dangerous systems against the public."