Amazon's facial-analysis tool showed gender and race bias, says study

The software thought darker-skinned women were men almost a third of the time.
Kris Holt, @krisholt
January 25, 2019
Image credit: Donat Sorokin via Getty Images

Research suggests Amazon's facial analysis software struggles with gender and racial bias. The MIT Media Lab found that Rekognition had no trouble correctly identifying the gender of lighter-skinned men, but it classified women as men almost a fifth of the time and darker-skinned women as men almost one time in three. IBM's and Microsoft's software performed better than Amazon's tool -- Microsoft's, for instance, mistook darker-skinned women for men just 1.5 percent of the time.

However, Amazon has disputed the results of MIT's tests, which took place over the course of 2018. It argued that the researchers hadn't used the current version of Rekognition, and said the gender identification test used facial analysis (which detects faces in images and assigns generic attributes to them, such as gender) rather than facial recognition (which matches a detected face against a specific known face). It noted that the two are distinct software packages.

"Using an up-to-date version of Amazon Rekognition with similar data downloaded from parliamentary websites and the Megaface dataset of [1 million] images, we found exactly zero false positive matches with the recommended 99 [percent] confidence threshold," Matt Wood, general manager of deep learning and AI at Amazon Web Services, told VentureBeat.

It's not the first time such software has come under fire. MIT and Stanford researchers said last February that three facial analysis programs exhibited similar gender and skin-color bias. Since then, IBM has released a dataset it says should improve the accuracy of facial analysis tools, while Microsoft has called for more regulation of the technology in order to maintain high standards.

Meanwhile, in November, a group of lawmakers asked Amazon for answers over its decision to supply Rekognition to law enforcement, after deeming the company's response to an earlier letter to be insufficient. Amazon has also pitched Rekognition to Immigration and Customs Enforcement (ICE), while some shareholders have asked the company to stop selling the tech -- they're concerned it may violate people's civil rights.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.