Modern technology gives us many things – and for people living with a disability, it can make a real difference to everyday life.

Apple’s new accessibility features include Live Speech, which reads out what you type.

Apple has outlined a number of new iPhone and iPad software features for cognitive, vision, hearing and mobility accessibility, which are expected to arrive as part of the iOS 17 and iPadOS 17 updates.

Apple has long worked closely with community groups when developing features that can make a big difference in people’s lives – in particular for people living with a disability.

Later this year, users with cognitive disabilities will be able to use their iPhone and iPad more easily with Assistive Access, while non-speaking users will be able to type to speak during calls with Live Speech.

For users who are blind or have low vision, Detection Mode in Magnifier will add Point and Speak, which reads out text as they point at it, making it easier to use physical objects like household appliances.

“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Tim Cook, Apple’s CEO.

“Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate and do what they love.”

Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said: “Accessibility is part of everything we do at Apple. These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

ASSISTIVE ACCESS

Assistive Access provides a distinct interface with high-contrast buttons and large text labels.

Users can opt for a visual, grid-based layout for their home screen and apps or, if they prefer, a row-based layout with text.

“The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual or knowledge barriers for these individuals,” said Katy Schmid, senior director of National Program Initiatives at The Arc of the United States.

“To have a feature that provides a cognitively accessible experience on iPhone or iPad — that means more open doors to education, employment, safety and autonomy. It means broadening worlds and expanding potential.”

LIVE SPEECH AND PERSONAL VOICE ADVANCE SPEECH ACCESSIBILITY

Live Speech lets you type what you want to say on an iPhone, iPad or Mac and have it read out loud during phone and FaceTime calls.

It’s also possible to save often-used phrases to be able to instantly chime in during conversations.

Those at risk of losing their ability to speak can create a Personal Voice that will speak on their behalf – in a voice that sounds like their own.

To create a Personal Voice, a user reads a set of text prompts to record 15 minutes of audio on their iPhone or iPad; the feature then integrates with Live Speech so they can speak with their Personal Voice.

“At the end of the day, the most important thing is being able to communicate with friends and family,” said Philip Green, board member and ALS advocate at the Team Gleason nonprofit, who has experienced significant changes to his voice since receiving his ALS diagnosis in 2018.

“If you can tell them you love them, in a voice that sounds like you, it makes all the difference in the world — and being able to create your synthetic voice on your iPhone in just 15 minutes is extraordinary.”

DETECTION MODE IN MAGNIFIER

Point and Speak in Magnifier is for users with vision disabilities who want to interact with physical objects like appliances and other devices.

Using a microwave as an example, the new feature combines the camera and the LiDAR scanner to announce the text on each button as the user moves their finger near it.

Point and Speak also works with other Magnifier features, including People Detection, Door Detection and Image Descriptions, to help visually impaired users navigate their surroundings.

ADDITIONAL FEATURES

– Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customise them for their hearing comfort.

– Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like ‘do’, ‘due’ and ‘dew’. Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad and Mac.

– Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favourite games on iPhone and iPad.

– For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar and Notes.

– Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.

– For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customise the rate at which Siri speaks to them, with options ranging from 0.8x to 2x.