Modern technology can do a great deal to make everyday life more accessible.

Apple’s new accessibility features include Door Detection and Apple Watch Mirroring

Apple has unveiled accessibility features that will give people with disabilities new ways to navigate, connect and communicate using their Apple products.

These new software features combine the power of the hardware with machine learning to deliver new customisable tools that build on Apple’s long-standing commitment to making its products work for all customers.

The new features include Door Detection, Apple Watch Mirroring, Voice Control and Switch Control, and Live Captions.

Apple is also adding 20 more languages to its already popular VoiceOver feature.

DOOR DETECTION

Door Detection is designed for users who are blind or have low vision. It uses LiDAR, the camera and on-device machine learning to help them locate a door when they arrive at a new location and to judge how far away they are from it.

The feature can also tell the user whether the door is open or closed, whether it needs to be pushed or pulled to open, or whether they need to turn a knob.

Door Detection can also read signs near the door, such as a room number or an accessible entrance symbol.

Door Detection will work alongside existing accessibility features like People Detection and Image Descriptions.
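Apple has not published a developer API for Door Detection itself, so the sketch below is only an illustration of one part of the idea: using ARKit’s public, LiDAR-backed raycasting to estimate how far the user is from the nearest vertical surface in front of the camera. The class name and structure are this article’s own assumptions, not Apple’s implementation, and the code assumes camera permission has been granted.

```swift
import ARKit
import simd

// Illustrative sketch only: estimates the distance to the nearest vertical
// surface ahead of the camera using ARKit raycasting. Not Apple's Door Detection.
final class DistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.vertical]            // doors sit in vertical planes
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh         // uses the LiDAR scanner where available
        }
        session.delegate = self
        session.run(config)
    }

    // Called every frame while the session runs.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Cast a ray from the centre of the captured image towards a vertical plane.
        let query = frame.raycastQuery(from: CGPoint(x: 0.5, y: 0.5),
                                       allowing: .estimatedPlane,
                                       alignment: .vertical)
        guard let hit = session.raycast(query).first else { return }

        // Distance in metres between the camera and the surface it is pointing at.
        let cameraPos = frame.camera.transform.columns.3
        let surfacePos = hit.worldTransform.columns.3
        let metres = simd_distance(SIMD3(cameraPos.x, cameraPos.y, cameraPos.z),
                                   SIMD3(surfacePos.x, surfacePos.y, surfacePos.z))
        print(String(format: "Surface ahead: %.1f m", Double(metres)))
    }
}
```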

APPLE WATCH MIRRORING

With Apple Watch Mirroring, Apple Watch becomes an even more valuable accessibility tool for people with physical and motor disabilities.

This allows the user to control Apple Watch remotely from their paired iPhone.

This means users can control their Apple Watch with assistive features like Voice Control and Switch Control, using voice commands, sound actions, head tracking or external Made for iPhone switches instead of tapping the Apple Watch display.

Users can do a lot more with Apple Watch through simple hand gestures.

The new Quick Actions on Apple Watch mean the user can clench their fist or pinch their fingers, rather than touching the display, to trigger actions like taking a photo or playing and pausing content.

LIVE CAPTIONS

Deaf and hard-of-hearing users will now be able to access Live Captions on iPhone, iPad and Mac.

Whether you are on a call, a FaceTime chat, streaming content or just talking to someone next to you, Live Captions can automatically transcribe speech to text in real time.

Users also have the option of typing a response and having it spoken out loud for them.
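Live Captions is a system-level feature with no public hook of its own, but the general pattern of real-time transcription is available to developers through Apple’s Speech framework. The sketch below illustrates that pattern rather than Apple’s implementation: it streams microphone audio into SFSpeechRecognizer and reports partial results as they arrive, and it assumes speech-recognition and microphone permissions have already been granted.

```swift
import Speech
import AVFoundation

// Illustrative sketch of real-time speech-to-text with the Speech framework.
final class LiveTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private var request: SFSpeechAudioBufferRecognitionRequest?
    private var task: SFSpeechRecognitionTask?

    func start(onUpdate: @escaping (String) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true          // stream captions as they form
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true     // keep audio on the device
        }
        self.request = request

        // Feed microphone buffers into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { [weak self] result, error in
            if let result = result {
                onUpdate(result.bestTranscription.formattedString)   // partial caption text
            }
            if error != nil || result?.isFinal == true {
                self?.stop()
            }
        }
    }

    func stop() {
        audioEngine.inputNode.removeTap(onBus: 0)
        audioEngine.stop()
        request?.endAudio()
        task?.cancel()
    }
}
```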

ADDITIONAL FEATURES
– With Buddy Controller, users can ask a care provider or friend to help them play a game; Buddy Controller combines any two game controllers into one, so multiple controllers can drive the input for a single player.

– With Siri Pause Time, users with speech disabilities can adjust how long Siri waits before responding to a request.

– Voice Control Spelling Mode gives users the option to dictate custom spellings using letter-by-letter input.

– Sound Recognition can be customised to recognise sounds that are specific to a person’s environment, like their home’s unique alarm, doorbell or appliances (a rough code sketch of the underlying idea follows this list).

– The Apple Books app will offer new themes and introduce customisation options such as bolding text and adjusting line, character and word spacing for an even more accessible reading experience.
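Sound Recognition itself is configured in Settings rather than through code, but the underlying idea of classifying live audio on device is exposed to developers in the SoundAnalysis framework. The sketch below is a rough illustration using the built-in sound classifier; for custom sounds such as a specific doorbell, a Create ML sound-classification model could be passed to SNClassifySoundRequest(mlModel:) instead. It assumes microphone permission has been granted.

```swift
import SoundAnalysis
import AVFoundation

// Illustrative sketch: classify microphone audio on device with SoundAnalysis.
final class SoundListener: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        // Built-in classifier; a custom Create ML model could be supplied via
        // SNClassifySoundRequest(mlModel:) for environment-specific sounds.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Stream microphone buffers into the analyzer.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        audioEngine.prepare()
        try audioEngine.start()
    }

    // Called whenever the analyzer produces a classification result.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first, top.confidence > 0.8 else { return }
        print("Heard: \(top.identifier) (\(Int(top.confidence * 100))%)")
    }
}
```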