Human rights organizations ask Zoom to scrap its emotion tracking AI in open letter

Fight for the Future and the other participants are hoping the letter can pressure Zoom to change its mind.


Digital rights non-profit Fight for the Future and 27 human rights organizations have written an open letter to Zoom, asking the company not to continue exploring the use of AI that can analyze emotions in its video conferencing platform. The groups wrote the letter in response to a Protocol report that said Zoom is actively researching how to incorporate emotion AI into its product in the future. It's part of a larger piece examining how companies have started using artificial intelligence to detect the emotional state of a potential client during sales calls.

The pandemic made video conferences a lot more common around the world. Salespeople, however, have found it hard to gauge how receptive potential clients are to their products and services when they can't read body language through a screen. As a result, companies have started using technology that can analyze people's moods during calls, and Protocol said Zoom plans to offer the same capability.

Fight for the Future and the other human rights organizations are hoping their call will pressure Zoom to abandon its plans. They called the technology "discriminatory, manipulative, potentially dangerous and based on assumptions that all people use the same facial expressions, voice patterns, and body language."

The groups also pointed out that the technology is inherently biased and racist, just like facial recognition. By incorporating the feature, they said, Zoom would be discriminating against certain ethnicities and people with disabilities. In addition, it could be used to punish students or workers who display the "wrong" emotion. In 2021, a project led by University of Cambridge professor Alexa Hagerty showed the limits of emotion recognition AIs and how easy they are to fool. Previous studies have also shown that emotion recognition programs fail the racial bias test and struggle to read Black faces.

The groups ended the letter by mentioning Zoom's decision to cancel the rollout of face-tracking features, calling this another opportunity to do right by its users. They're asking Zoom to commit by May 20th, 2022 to not implementing emotion AI in its product.