Apple’s new accessibility features include eye tracking and aid for motion sickness

Apple has found a way to help its users with motion sickness on the road in its latest accessibility-focused software updates.

The company presented several accessibility features for its upcoming iOS 18 and iPadOS 18 releases, scheduled for later this year. The new features should make it easier for people with motion sickness, physical disabilities, and hearing or speech problems to use various Apple products.

Eye Tracking

Among Apple’s newly presented software features, this one is dedicated to people with physical disabilities. With Eye Tracking, they can navigate their device and activate selected elements using “Dwell Control.”

The feature uses the front-facing camera and sets up and calibrates in seconds. “With on-device machine learning, all data used to set up and control this feature is kept securely on the device,” explained Apple.

The feature works across iPadOS and iOS apps.

Vehicle Motion Cues

This new feature, which will be available for iPhone and iPad, is focused on reducing motion sickness while traveling in a vehicle. Motion sickness usually arises from a “sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle.”

With this in mind, Apple created “animated dots” that appear at the edges of the screen, representing changes in the vehicle’s motion. These dots should ease the “sensory conflict” while a user is engaged with content on the device.

Apple's Motion Cues to reduce motion sickness. Image by Apple

Music Haptics

Another inclusive feature presented by Apple is designed for users who are deaf or hard of hearing. It works through the Taptic Engine, which “plays taps, textures, and refined vibrations to the audio of the music.”

Music Haptics is available on millions of songs found on Apple Music. It will also work as an API so developers can use the feature to make music accessible in their apps.

Vocal Shortcuts

This feature, which allows users to “assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks,” was created for people whose speech is affected by conditions such as cerebral palsy, stroke, or amyotrophic lateral sclerosis (ALS).

Alongside Vocal Shortcuts, Apple also introduced Listen for Atypical Speech, which uses “on-device machine learning to recognize user speech patterns.” This way, users with atypical speech can more easily control their devices.

New features for CarPlay

New accessibility features will also be available on CarPlay. These include Voice Control, which lets users control apps and navigate CarPlay using their voice.

For users who are deaf or hard of hearing, Apple introduced Sound Recognition, which sends alerts about car horns and sirens.

With Color Filters, users with color blindness will be able to “make the CarPlay interface visually easier to use, with additional visual accessibility features including Bold Text and Large Text.”

Accessibility features for visionOS

The main accessibility feature for visionOS is Live Captions, a tool that helps users “follow along with spoken dialogue in live conversations and in audio from apps.”

Live Captions will also be available on FaceTime.

The company also added that “updates for vision accessibility will include the addition of Reduce Transparency, Smart Invert, and Dim Flashing Lights for users who have low vision, or those who want to avoid bright lights and frequent flashing.”
