Image credit: Donat Sorokin via Getty Images
    Amazon's facial-analysis tool showed gender and race bias, says study

    by Kris Holt
    01.25.2019

    Research suggests Amazon's facial-analysis algorithms have struggled with gender and racial bias. The MIT Media Lab found that Rekognition had no trouble correctly identifying the gender of lighter-skinned men, but it misclassified women as men almost a fifth of the time, and darker-skinned women as men in almost one out of three cases. IBM and Microsoft software performed better than Amazon's tool -- Microsoft's solution mistook darker-skinned women for men just 1.5 percent of the time.