
Microsoft’s neural voice tool for people with speech disabilities arrives later this year

At the Microsoft Ability Summit today, the company is continuing to raise awareness about inclusive design.


At its 14th Ability Summit, which kicks off today, Microsoft is highlighting developments and collaborations across its portfolio of assistive products. Much of that centers on Azure AI, including features announced yesterday such as AI-powered audio descriptions and Azure AI Studio, which better enables developers with disabilities to create machine-learning applications. The company also showed off updates like more languages and richer AI-generated descriptions for its Seeing AI tool, as well as new playbooks offering best-practice guidelines in areas like building accessible campuses and supporting mental health.

The company is also previewing a feature called “Speak For Me,” which is coming later this year. Much like Apple’s Personal Voice, Speak For Me can help people with ALS and other speech disabilities use custom neural voices to communicate. Work on this project has been ongoing “for some time” with partners like the non-profit ALS organization Team Gleason, and Microsoft said it’s “committed to making sure this technology is used for good and plan to launch later in the year.” The company also shared that it’s working with Answer ALS and the ALS Therapy Development Institute (TDI) to “almost double the clinical and genomic data available for research.”

One of the most significant accessibility updates coming this month is that Copilot will gain new accessibility skills that let users ask the assistant to launch Live Captions and Narrator, among other assistive tools. The Accessibility Assistant feature announced last year will be available today in the Insider preview for Microsoft 365 apps like Word, and the company says it will come “soon” to Outlook and PowerPoint. Microsoft is also publishing four new playbooks today, including a Mental Health toolkit, which covers “tips for product makers to build experiences that support mental health conditions, created in partnership [with] Mental Health America.”

Ahead of the summit, the company’s chief accessibility officer Jenny Lay-Flurrie spoke with Engadget to share more insight into the news, as well as her thoughts on generative AI’s role in building assistive products.

“In many ways, AI isn’t new,” she said, adding “this chapter is new.” Generative AI may be all the rage right now, but Lay-Flurrie believes that the core principle her team relies on hasn’t changed. “Responsible AI is accessible AI,” she said.

Still, generative AI could bring many benefits. “This chapter, though, does unlock some potential opportunities for the accessibility industry and people with disabilities to be able to be more productive and to use technology to power their day,” she said. She pointed to a survey the company ran with the neurodiverse community around Microsoft 365 Copilot; the feedback from the few hundred respondents, Lay-Flurrie said, was that “this is reducing time for me to create content and it’s shortening that gap between thought and action.”

Being responsible when embracing new technology trends in accessible design is never far from Lay-Flurrie’s mind. “We still need to be very principled, thoughtful and if we hold back, it’s to make sure that we are protecting those fundamental rights of accessibility.”

Elsewhere at the summit, Microsoft is featuring guest speakers including actor Michelle Williams, who will discuss mental health, and its own employee Katy Jo Wright, who will share her experience living with chronic Lyme disease. Amsterdam’s Rijksmuseum will also explain how it used Azure AI’s computer vision and generative AI to provide image descriptions of more than a million pieces of art for visitors who are blind or have low vision.
