Research suggests Amazon's facial analysis algorithms have struggled with gender and racial bias. The MIT Media Lab found Rekognition had no trouble correctly identifying the gender of lighter-skinned men, but it classified women as men almost a fifth of the time, and darker-skinned women as men in almost one out of three cases. IBM's and Microsoft's software performed better than Amazon's tool -- Microsoft's solution misclassified darker-skinned women as men only 1.5 percent of the time.
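Audits like MIT's work by running a labeled benchmark through the classifier and tallying errors separately for each demographic group, so that a high overall accuracy can't hide a high error rate for one subgroup. A minimal sketch of that tally (the group names and sample records below are hypothetical illustrations, not MIT's actual data or methodology):

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute the misclassification rate per demographic group.

    records: iterable of (group, true_label, predicted_label) tuples.
    Returns {group: fraction of that group's records where the
    prediction disagrees with the ground-truth label}.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if pred != truth:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit sample for illustration only:
sample = [
    ("lighter-skinned male", "M", "M"),
    ("lighter-skinned male", "M", "M"),
    ("darker-skinned female", "F", "M"),  # misclassified as male
    ("darker-skinned female", "F", "F"),
    ("darker-skinned female", "F", "F"),
]
rates = error_rates_by_group(sample)
# rates["lighter-skinned male"] is 0.0; rates["darker-skinned female"] is 1/3
```

Reporting the per-group rates side by side, rather than one aggregate accuracy number, is what surfaced the disparity in the first place.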
However, Amazon disputed the results of MIT's tests, which were conducted over the course of 2018. It argued that the researchers hadn't used the current version of Rekognition, and that the gender identification test relied on facial analysis (which picks out faces in images and assigns generic attributes to them) rather than facial recognition, which looks for a match to a specific face. The two, it noted, are distinct software packages.
"Using an up-to-date version of Amazon Rekognition with similar data downloaded from parliamentary websites and the Megaface dataset of [1 million] images, we found exactly zero false positive matches with the recommended 99 [percent] confidence threshold," Matt Wood, general manager of deep learning and AI at Amazon Web Services, told VentureBeat.
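Wood's point about the 99 percent confidence threshold matters because a face-search API returns a similarity score with each candidate match, and the caller chooses the cutoff below which matches are discarded (in Rekognition's real API, this is the `FaceMatchThreshold` parameter of `SearchFacesByImage`). A minimal sketch of that filtering step, using made-up scores rather than real API output:

```python
def filter_matches(candidates, threshold=99.0):
    """Keep only candidate matches at or above the confidence threshold.

    candidates: list of (face_id, similarity_percent) pairs, as a
    face-search API might return them. Raising the threshold trades
    recall for fewer false positives -- the trade-off at the center
    of the dispute over Rekognition's accuracy.
    """
    return [(fid, sim) for fid, sim in candidates if sim >= threshold]

# Hypothetical similarity scores for illustration:
candidates = [("face-a", 99.4), ("face-b", 87.2), ("face-c", 99.9)]
strict = filter_matches(candidates)                # face-a and face-c survive
loose = filter_matches(candidates, threshold=80)   # all three survive
```

At the recommended 99 percent cutoff, the 87.2-score candidate is dropped; at a looser 80 percent cutoff it would count as a match, which is why the chosen threshold shapes any false-positive measurement.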
It's not the first time such software has been under fire. MIT and Stanford researchers said last February that three facial analysis programs exhibited similar gender and skin-color bias. Since then, IBM has released a dataset it says should improve accuracy in facial analysis tools, while Microsoft has called for more regulation of the technology in order to maintain high standards.
Meanwhile, in November, a group of lawmakers asked Amazon for answers about its decision to supply Rekognition to law enforcement, after deeming the company's response to an earlier letter insufficient. Amazon has also pitched Rekognition to Immigration and Customs Enforcement (ICE), while some shareholders have asked the company to stop selling the tech -- they're concerned it may violate people's civil rights.