Microsoft will phase out facial recognition AI that could detect emotions

The move comes as Microsoft pushes for more responsible uses of AI.

Microsoft is keenly aware of the mounting backlash against facial recognition, and it's shuttering a significant project in response. The company has revealed it will "retire" facial recognition technology that it said could infer emotions as well as characteristics like age, gender and hair. The AI raised privacy questions, Microsoft said, and making the capability broadly available created the potential for discrimination and other abuses. There was also no clear consensus on the definition of emotions, and no reliable way to generalize the link between facial expressions and emotional states.

New users of Microsoft's Face programming framework no longer have access to these attribute detection features. Current customers can use them until June 30th, 2023. Microsoft will still fold the tech into "controlled" accessibility tools, such as its Seeing AI for people with vision issues.
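For context, these attributes were requested through the Face API's detect call. The sketch below is illustrative, not from the article: the endpoint, key and image URL are placeholders for a hypothetical Azure Face resource, and it shows the kind of attribute request that new customers can no longer make.

```python
# Minimal sketch of a pre-retirement Face API attribute request.
# ENDPOINT, KEY and the image URL are placeholders, not real values.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "<your-subscription-key>"  # placeholder

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={
        # These attribute values are among those being retired: new
        # customers can no longer request them, and existing customers
        # lose access after June 30th, 2023.
        "returnFaceAttributes": "age,gender,emotion,hair",
        "returnFaceId": "false",
    },
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/photo.jpg"},  # placeholder image URL
)
response.raise_for_status()

# The service returns one entry per detected face; the requested
# attributes come back under "faceAttributes".
for face in response.json():
    print(face.get("faceAttributes"))
```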

The exit comes as Microsoft shares its Responsible AI Standard framework with the public for the first time. The guidelines illustrate the tech firm's decision-making process, including a focus on principles like inclusion, privacy and transparency. The release also represents the first major update to the standard since it was introduced in late 2019, and it promises more fairness in speech-to-text technology, stricter controls for neural voice and "fit for purpose" requirements that rule out the emotion-detecting system.

Microsoft isn't the first company to have second thoughts about facial recognition. IBM stopped work in the field over worries its projects could be used for human rights abuses. With that said, this is still a major change of heart: one of the world's largest cloud and computing companies is backing away from AI that could have a substantial impact.
