Microsoft is highlighting new developments and collaborations across its assistive products at its 14th Ability Summit, with a focus on Azure AI.
These include recent features such as AI-powered audio descriptions and Azure AI Studio, which makes development more accessible for developers with disabilities. Updates to the Seeing AI tool add support for more languages and more detailed AI-generated descriptions, and new playbooks offer guidelines for building accessible campuses and improving mental health support.
Microsoft also previewed a forthcoming feature called “Speak For Me,” set to launch later this year. Similar to Apple’s Personal Voice, it helps people with ALS and other speech disabilities communicate using custom neural voices. Microsoft collaborated with the ALS non-profit Team Gleason on the project and says it is committed to ensuring the technology is used for positive purposes.
Other accessibility updates include new Copilot skills for launching Live Captions and Narrator, and the Accessibility Assistant feature is now available in Microsoft 365 apps such as Word, with plans to expand to Outlook and PowerPoint. Microsoft is also releasing new playbooks, including a Mental Health toolkit created in partnership with Mental Health America.
Jenny Lay-Flurrie, Microsoft’s chief accessibility officer, highlights the importance of responsible AI in building accessible products. While generative AI presents opportunities for increased productivity and technology use for people with disabilities, Lay-Flurrie emphasizes the need to remain principled and thoughtful in adopting new technology trends.
The summit features guest speakers like actor Michelle Williams and Microsoft employee Katy Jo Wright, addressing mental health and chronic Lyme disease, respectively. Additionally, Amsterdam’s Rijksmuseum will showcase how it utilized Azure AI’s computer vision and generative AI to provide image descriptions for over a million pieces of art, benefiting visitors who are blind or have low vision.
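For readers curious what generating such image descriptions can look like in practice, here is a minimal, hypothetical sketch of requesting a caption from Azure AI Vision's image-analysis service in Python. The endpoint, key, and image URL are placeholder assumptions, and this illustrates the general API pattern rather than the Rijksmuseum's actual pipeline.

```python
# Illustrative sketch only: request a descriptive caption for an image
# using Azure AI Vision's image-analysis client. The endpoint, key, and
# image URL below are placeholders, not values from the article.
from azure.ai.vision.imageanalysis import ImageAnalysisClient
from azure.ai.vision.imageanalysis.models import VisualFeatures
from azure.core.credentials import AzureKeyCredential

client = ImageAnalysisClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

# Ask the service for a single natural-language caption of the image.
result = client.analyze_from_url(
    image_url="https://example.com/painting.jpg",
    visual_features=[VisualFeatures.CAPTION],
    gender_neutral_caption=True,
)

if result.caption is not None:
    print(f"Description: {result.caption.text} "
          f"(confidence {result.caption.confidence:.2f})")
```

In a museum-scale scenario, a caption request like this would typically be run in batch over a catalog of artwork images and the resulting text stored as alt text for screen readers.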