Apple previews eye-tracking for iPhone, iPad, anti-car-sickness dots

Author
Chris Keall,
Publish Date
Thu, 16 May 2024, 2:50pm
Coming later this year, Apple’s new accessibility features include Eye Tracking, a way for users to navigate iPad and iPhone with just their eyes. Photo / Apple video still

Apple has marked Global Accessibility Awareness Day by previewing a slew of new accessibility features - including an AI-powered option that lets people with disabilities navigate an iPhone or iPad using just their eyes.

The features will arrive later this year, likely with iOS 18 and iPadOS 18.

Eye Tracking will use a device’s front-facing camera to follow a person’s gaze. With “Dwell Control”, pausing your gaze on an icon or other element will be the equivalent of physically selecting it, letting you access additional functions such as physical buttons, swipes and other gestures solely with your eyes.
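
Apple hasn’t detailed how Dwell Control works under the hood. As a purely illustrative sketch - not Apple’s implementation, with all names hypothetical - the core logic of a dwell-based “tap” could look something like this:

```swift
import Foundation
import CoreGraphics

// Hypothetical sketch of dwell-control selection logic: if the gaze
// point rests inside a target's bounds for a set duration, treat it
// as a tap. Illustrative only; not Apple's implementation.
final class DwellSelector {
    private let dwellDuration: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellDuration: TimeInterval = 1.0) {
        self.dwellDuration = dwellDuration
    }

    /// Feed in each gaze sample; returns true when a dwell completes.
    func update(gaze: CGPoint, target: CGRect, now: Date = Date()) -> Bool {
        guard target.contains(gaze) else {
            // Gaze left the target: reset the dwell timer.
            dwellStart = nil
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            // Gaze moved to a new target: restart the timer.
            currentTarget = target
            dwellStart = now
            return false
        }
        if let start = dwellStart, now.timeIntervalSince(start) >= dwellDuration {
            // Gaze has rested long enough: "select" the target once.
            dwellStart = nil
            return true
        }
        return false
    }
}
```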

All of the eye-tracking AI and machine learning will be done on your device, with no data shared with Apple or any third party.

And the tech should work across iOS and iPadOS, in both Apple and third-party apps.

Another pending feature will help anyone who suffers car sickness when using an iPhone or iPad in a moving vehicle (as a passenger, naturally).

“Vehicle Motion Cues” involves dots that appear on the edge of your iPhone or iPad’s screen when your device detects you’re in a moving vehicle.

Using sensors built into your device, the dots move to match the motion of the vehicle - helping to reduce the sometimes nausea-inducing mismatch between a static screen and a moving car.
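
Apple hasn’t published how Vehicle Motion Cues is built, but purely as an illustration of the idea, a developer could approximate the effect today with the public Core Motion framework, nudging a dot in step with the acceleration the sensors report:

```swift
import CoreMotion
import UIKit

// A rough sketch of the idea behind motion cues: read the sensed
// acceleration from the device and shift an on-screen dot to match,
// so what the eye sees agrees with what the inner ear feels.
// Illustrative only; not Apple's implementation.
final class MotionCueView: UIView {
    private let motionManager = CMMotionManager()
    private let dotLayer = CALayer()

    func startCues() {
        dotLayer.bounds = CGRect(x: 0, y: 0, width: 8, height: 8)
        dotLayer.cornerRadius = 4
        dotLayer.backgroundColor = UIColor.systemGray.cgColor
        layer.addSublayer(dotLayer)

        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let self = self, let accel = motion?.userAcceleration else { return }
            // Move the dot in the direction of the sensed acceleration.
            let scale: CGFloat = 40
            self.dotLayer.position = CGPoint(
                x: self.bounds.midX + CGFloat(accel.x) * scale,
                y: self.bounds.midY + CGFloat(accel.y) * scale
            )
        }
    }
}
```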

The dots will work whether you’re in a third-party app or watching a movie.

(If it’s not your bag, Motion Cues can be disabled in Control Centre.)

And following on from the voice-cloning feature showcased last year - aimed at people at risk of losing the ability to speak - there is a range of new speech features on the way, again all AI-powered.

They include Vocal Shortcuts, where a word - or just an utterance - can be recognised by Siri to trigger an action such as calling a friend or family member.

Setup will be as simple as uttering a sound three times.
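
Vocal Shortcuts isn’t public API, so the following hypothetical Swift sketch only shows the shape of the idea - mapping a recognised utterance to an action - with every name invented for illustration:

```swift
import Foundation

// Hypothetical sketch: route whatever a speech recogniser heard to a
// registered action. All names here are illustrative, not Apple's API.
struct VocalShortcut {
    let trigger: String          // e.g. a word or custom utterance label
    let action: () -> Void
}

final class ShortcutRouter {
    private var shortcuts: [String: () -> Void] = [:]

    func register(_ shortcut: VocalShortcut) {
        shortcuts[shortcut.trigger.lowercased()] = shortcut.action
    }

    /// Called with the text (or utterance label) the recogniser produced.
    func handle(recognised text: String) {
        shortcuts[text.lowercased()]?()
    }
}

// Usage: register a phrase once, then saying it triggers the action.
let router = ShortcutRouter()
router.register(VocalShortcut(trigger: "ring mum") { print("Calling Mum…") })
router.handle(recognised: "Ring Mum")
```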

Listen for Atypical Speech will use on-device machine learning to recognise user speech patterns. Designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS) or stroke, these enhanced speech recognition features will provide a new level of customisation and control, according to an Apple preview.

Pending accessibility updates to CarPlay include Sound Recognition, which allows drivers or passengers who are deaf or hard of hearing to turn on alerts to be notified of car horns and sirens.

A new CarPlay-based audio feature was also previewed.

Using your iPhone or iPad’s microphone, the Sound Recognition feature will listen for the likes of an ambulance siren or an impatient driver honking behind you, then display a text alert on your car’s screen.
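
Apple hasn’t said how the CarPlay feature is implemented, but its existing public SoundAnalysis framework already ships a built-in classifier that can label sounds in a similar way. A rough sketch of the concept, not Apple’s code:

```swift
import AVFoundation
import SoundAnalysis

// Illustrative sketch of siren/horn detection using Apple's public
// SoundAnalysis framework and its built-in sound classifier.
final class SirenListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Stream microphone buffers into the analyzer.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // Label strings here are assumptions; the real list is available
        // via SNClassifySoundRequest.knownClassifications.
        if top.identifier == "siren" || top.identifier == "car_horn" {
            print("Alert: \(top.identifier) detected")
        }
    }
}
```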

Music Haptics will allow the deaf or hard-of-hearing to experience music through taps and vibrations. It will work with Apple Music, with an API available for other music services to enable it on their platforms.
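
The Music Haptics API itself hasn’t been published yet. As a taste of the concept, the existing Core Haptics framework can already render beat-like taps; the beat times below are invented for illustration:

```swift
import CoreHaptics

// Illustrative only: play one sharp haptic "tap" per beat using the
// public Core Haptics framework. The beat times are made up; a real
// implementation would derive them from the audio.
func playBeatHaptics() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let beatTimes: [TimeInterval] = [0.0, 0.5, 1.0, 1.5]
    let events = beatTimes.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: t
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```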

visionOS will offer Live Captions, so users who are deaf or hard of hearing can follow along with spoken dialogue in live conversations and in audio from apps.

And Apple’s Vision Pro (still only available in the US, though with an increasing number of DIY imports) will gain new accessibility features in visionOS, including the Live Captions mentioned above, so users who are deaf or hard of hearing can follow along with spoken dialogue in live conversations and in audio from apps.

Other new features will include Hover Typing, which shows larger text when typing in a text field, in the user’s preferred font and colour. There will also be broader language support for Personal Voice - the aforementioned feature that allows someone to clone their voice before they lose the ability to speak, for more natural-sounding synthetic speech.

“Imagine how differently we would remember Stephen Hawking if he had been able to communicate with the world using a sample of his real voice, rather than something synthetic. Our voices are unique and so personal to us,” said accessibility advocate Jonathan Mosen when the feature debuted with iOS 17.

Chris Keall is an Auckland-based member of the Herald’s business team. He joined the Herald in 2018 and is the technology editor and a senior business writer.
