iPhones will soon speak in your voice after just 15 minutes of training, as Apple unveils more accessibility features

Apple has announced new features for users with cognitive, vision, and hearing disabilities as part of its Global Accessibility Awareness Day campaign. Some of the key features coming to iPhones include “Assistive Access,” “Personal Voice,” and “Point and Speak in Magnifier.” For select regions, Apple is also rolling out additional software features, curated collections, and more. The company says the new tools draw on advances in hardware and software, including on-device machine learning to ensure user privacy.

Perhaps the most significant feature is Personal Voice, a speech accessibility advance for users at risk of losing their ability to speak, such as those with a recent diagnosis of ALS (amyotrophic lateral sclerosis) or other conditions that can affect speech. The tool aims to let users speak in a voice that sounds like their own via the iPhone. In a blog post, Apple explains:

“Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad. This speech accessibility feature uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.”
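
On the developer side, AVFoundation's speech synthesis APIs expose Personal Voice to third-party apps: an app can request authorization and, if granted, speak with the user's voice. A minimal sketch, assuming the iOS 17 additions (the requestPersonalVoiceAuthorization call and the isPersonalVoice voice trait):

```swift
import AVFoundation

// Minimal sketch: ask for access to the user's Personal Voice, then
// find it among the installed synthesis voices (iOS 17+ APIs assumed).
final class PersonalVoiceSpeaker {
    // Keep a strong reference; a deallocated synthesizer stops mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard status == .authorized else { return }  // .denied, .unsupported, .notDetermined

            // Personal Voices are listed alongside the system voices,
            // flagged with the .isPersonalVoice trait.
            let personalVoice = AVSpeechSynthesisVoice.speechVoices()
                .first { $0.voiceTraits.contains(.isPersonalVoice) }

            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoice  // nil falls back to the default voice
            self?.synthesizer.speak(utterance)
        }
    }
}
```

The first call triggers a one-time authorization prompt; devices without Personal Voice report an unsupported status rather than returning a voice.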

Apart from Personal Voice, Apple is adding Live Speech on iPhone, iPad, and Mac for users with speech disabilities. Users can type what they want to say and have it spoken out loud during phone and FaceTime calls as well as in-person conversations.
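
Live Speech itself is a system feature without a dedicated public API, but the type-to-speak flow is easy to picture with AVFoundation's long-standing synthesizer. A hypothetical SwiftUI sketch of the idea (the view and its styling are invented for illustration):

```swift
import SwiftUI
import AVFoundation

// Hypothetical type-to-speak view mimicking the Live Speech flow in-app.
struct TypeToSpeakView: View {
    @State private var message = ""
    // Held in @State so the synthesizer survives view updates.
    @State private var synthesizer = AVSpeechSynthesizer()

    var body: some View {
        VStack(spacing: 12) {
            TextField("Type what you want to say", text: $message)
                .textFieldStyle(.roundedBorder)
            Button("Speak") {
                synthesizer.speak(AVSpeechUtterance(string: message))
            }
        }
        .padding()
    }
}
```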

Assistive Access is designed for users with cognitive disabilities. The tool offers a streamlined app experience by paring apps down to their essential features, with high-contrast buttons and large text labels, to help users focus on the options most relevant to them. For example, for users who prefer communicating visually, Messages includes an emoji-only keyboard and the option to record a video message to share with loved ones. Users and trusted supporters can also choose between a more visual, grid-based layout for the Home Screen and apps, or a row-based layout for users who prefer text.
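
Apple has not described how Assistive Access builds these layouts. Purely as an illustration of the grid-versus-row idea, here is a hypothetical SwiftUI sketch; the app names, colors, and sizes are invented for the example:

```swift
import SwiftUI

// Invented example of the two layout styles Apple describes:
// a visual grid for some users, a text-first row list for others.
enum LayoutStyle { case grid, rows }

struct SimplifiedHome: View {
    var style: LayoutStyle = .grid
    let apps = ["Calls", "Messages", "Camera", "Photos", "Music"]

    var body: some View {
        let columns = Array(repeating: GridItem(.flexible()),
                            count: style == .grid ? 2 : 1)
        LazyVGrid(columns: columns, spacing: 16) {
            ForEach(apps, id: \.self) { name in
                Text(name)
                    .font(.largeTitle.weight(.bold))   // large text labels
                    .frame(maxWidth: .infinity, minHeight: 120)
                    .background(Color.black)           // high-contrast buttons
                    .foregroundColor(.yellow)
                    .clipShape(RoundedRectangle(cornerRadius: 20))
            }
        }
        .padding()
    }
}
```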

For iPhones with a LiDAR Scanner, a new Point and Speak mode in Magnifier will let users with vision disabilities interact with physical objects. Apple states that Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across a keypad.
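
Point and Speak has no public API, but the ingredients Apple lists map onto public frameworks: the Vision framework can recognize text in a camera frame, and AVFoundation can read it aloud. A simplified sketch under those assumptions, leaving out the LiDAR-based finger tracking, which has no direct public equivalent:

```swift
import Vision
import AVFoundation

// Simplified stand-in for Point and Speak: recognize printed text in a
// captured frame and read it aloud. The real feature also uses the LiDAR
// Scanner to track where the user's finger is pointing; that part is omitted.
let synthesizer = AVSpeechSynthesizer()

func speakText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```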

Beyond the new tools, Apple will roll out SignTime in Germany, Italy, Spain, and South Korea on May 18 to connect Apple Store and Apple Support customers with on-demand sign language interpreters. Select Apple Store locations around the world are also offering informative sessions throughout the week to help customers discover accessibility features.
