Apple Highlights New Accessibility Features Arriving Later This Year
With WWDC 2024 less than a month away, Apple has just previewed a number of new accessibility features arriving later this year for the iPhone, iPad, and Apple Vision Pro.
The first new feature, Eye Tracking, uses artificial intelligence to let users with physical disabilities control and navigate an iPhone or iPad with just their eyes. It uses the front-facing camera for setup and calibration; all of the data is stored on the device, and no additional hardware is required.
Music Haptics will work with Apple Music and with third-party apps that adopt it through an API. When turned on, the feature uses the Taptic Engine in the iPhone to play taps, textures, and refined vibrations matched to the audio of a song, letting users who are deaf or hard of hearing experience music.
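Apple hasn't published details of the Music Haptics API yet, but haptic playback on iPhone today goes through the Core Haptics framework. Here is a minimal sketch of the kind of tap-and-texture playback the feature describes, assuming illustrative intensity and sharpness values rather than anything Apple has specified:

    import CoreHaptics

    // Plays a sharp transient "tap" followed by a softer continuous
    // "texture" on the Taptic Engine. Values are illustrative only.
    func playSampleHaptics() throws {
        // Skip hardware without haptics support.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        let engine = try CHHapticEngine()
        try engine.start()

        // A strong, sharp tap at t = 0.
        let tap = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
            ],
            relativeTime: 0
        )

        // A gentler sustained vibration from 0.1s to 0.6s.
        let texture = CHHapticEvent(
            eventType: .hapticContinuous,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2),
            ],
            relativeTime: 0.1,
            duration: 0.5
        )

        let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
        let player = try engine.makePlayer(with: pattern)
        try player.start(atTime: CHHapticTimeImmediate)
    }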
Vocal Shortcuts will let iPhone and iPad users assign custom utterances that Siri can understand to launch Shortcuts and complete specific tasks. The feature is designed for users with conditions that affect speech, such as cerebral palsy or ALS.
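Apple hasn't said how third-party tasks will plug into Vocal Shortcuts, but apps already expose actions to Siri and the Shortcuts app through the App Intents framework, which is presumably the kind of task an utterance would trigger. A minimal sketch, with a hypothetical "open daily note" action:

    import AppIntents

    // Hypothetical action for illustration. An intent like this appears in
    // the Shortcuts app, so a custom utterance could be assigned to run it.
    struct OpenDailyNoteIntent: AppIntent {
        static var title: LocalizedStringResource = "Open Daily Note"
        static var description = IntentDescription("Opens today's note in the app.")

        func perform() async throws -> some IntentResult {
            // App-specific work (navigation, data lookup, etc.) goes here.
            return .result()
        }
    }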
To help reduce motion sickness, Vehicle Motion Cues will show animated dots on the edges of the screen that move with the vehicle, easing the sensory conflict between what a passenger sees and feels without interfering with the main content on screen.
Three new accessibility features are also coming to CarPlay: Voice Control, Color Filters, and Sound Recognition. Voice Control lets CarPlay users navigate and control apps with voice commands. Sound Recognition alerts drivers or passengers who are deaf or hard of hearing when a car horn or siren sounds. Color Filters make the interface easier to use for drivers who are color blind.
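Apple hasn't described how Sound Recognition works under the hood in CarPlay, but iOS already performs on-device sound classification, and developers can reach a similar capability through the SoundAnalysis framework, whose built-in classifier includes labels for sounds like sirens and car horns. A rough sketch of the detection side, with an illustrative confidence threshold:

    import AVFAudio
    import SoundAnalysis

    // Receives classification results as audio buffers are analyzed.
    final class AlertSoundObserver: NSObject, SNResultsObserving {
        func request(_ request: SNRequest, didProduce result: SNResult) {
            guard let result = result as? SNClassificationResult,
                  let top = result.classifications.first else { return }
            // 0.8 is an illustrative threshold, not an Apple-recommended value.
            if top.confidence > 0.8 {
                print("Detected \(top.identifier) (confidence \(top.confidence))")
            }
        }
    }

    // Keep a strong reference to the observer for the analyzer's lifetime.
    let observer = AlertSoundObserver()

    func makeAnalyzer(format: AVAudioFormat) throws -> SNAudioStreamAnalyzer {
        let analyzer = SNAudioStreamAnalyzer(format: format)
        // .version1 is the built-in sound classifier shipped with iOS 15+.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: observer)
        return analyzer
    }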
Finally, accessibility features such as Live Captions in FaceTime are scheduled to arrive on visionOS and the Apple Vision Pro.