Apple introduces innovative accessibility software features
Apple has previewed new software features for cognitive, vision, hearing and mobility accessibility, along with innovative tools for nonspeaking individuals and those at risk of losing their ability to speak.
In a long-standing commitment to making products for everyone, Apple has been working in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop impactful accessibility features.
Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesised voice that sounds like them to connect with family and friends.
For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point towards and reads it out loud to help them interact with physical objects such as household appliances.
These updates draw on advances in hardware and software, including on-device machine learning to ensure user privacy.
Tim Cook, CEO of Apple, says: “At Apple, we’ve always believed that the best technology is technology built for everyone.”
“Today, we’re excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate and do what they love.”
Sarah Herrlinger, Senior Director of Global Accessibility Policy and Initiatives at Apple, says that accessibility is part of everything they do at Apple.
“These groundbreaking features were designed with feedback from members of disability communities every step of the way to support a diverse set of users and help people connect in new ways,” says Herrlinger.
Assistive Access lightens the cognitive load by distilling apps to their essential features. Based on feedback from users, the experience focuses on activities foundational to iPhone and iPad: connecting with loved ones, capturing and enjoying photos, and listening to music.
Assistive Access also includes a customised experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos and Music.
Users can experience a distinct interface with high-contrast buttons, large text labels, and tools to help trusted supporters tailor the experience for the individual they support.
Katy Schmid, senior director of National Program Initiatives at The Arc of the United States, says the recent innovations in accessible technology are much needed.
“The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual or knowledge barriers for these individuals,” says Schmid.
“To have a feature that provides a cognitively accessible experience on iPhone or iPad means more open doors to education, employment, safety and autonomy.
“It means broadening worlds and expanding potential,” Schmid explains.
With Live Speech, users can type what they want to say to have it be spoken out loud during phone and FaceTime calls, as well as in-person conversations. They can also save common phrases and use them quickly during lively discussions.
For users at risk of losing their ability to speak, Personal Voice can be used to create a voice that sounds like them. Users can record a personalised voice by reading along with a randomised set of text prompts, which can then be integrated with Live Speech to connect with friends and family.
Phillip Green, board member and ALS advocate at the Team Gleason nonprofit, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018, says: “At the end of the day, the most important thing is being able to communicate with friends and family.”
“If you can tell them you love them in a voice that sounds like you, it makes all the difference in the world, and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary,” says Green.
Point and Speak in Magnifier makes it easier for users with vision disabilities to interact with physical objects that have several text labels. Point and Speak combines input from the Camera app, the LiDAR Scanner and on-device machine learning to announce the text on each button as users move their finger across a keypad.
Point and Speak also works with VoiceOver and can be used with other Magnifier features such as People Detection, Door Detection and Image Descriptions to help users navigate their physical environment.
To celebrate Global Accessibility Awareness Day around the world, Apple is also introducing new features and curated collections this week.
This includes SignTime, launching in Germany, Italy, Spain and South Korea, to connect Apple Store and Apple Support customers with on-demand sign language interpreters.
There will also be informative sessions in Apple Store locations throughout the week to help customers discover accessibility features. Furthermore, Apple Carnegie Library will feature a Today at Apple session with sign language performer and interpreter Justina Miles.