Thursday is Global Accessibility Awareness Day, and to mark the occasion, Apple has previewed several new accessibility features coming to its OSes later this year. Although the accessibility preview has become an annual affair, this year's is more packed than most, with features for navigating UIs, automating tasks, interacting with Siri and CarPlay, enabling live captions in visionOS, and more. Apple hasn't announced release dates for these features, but if past years are any indication, most should ship in the fall as part of the annual OS release cycle.
Eye Tracking
Often, Apple’s work in one area lends itself to new accessibility features in another. With Eye Tracking in iOS and iPadOS, the connection to the company’s work on visionOS is clear. The feature will let users control the iPhone and iPad with their eyes: the front-facing camera, combined with an on-device machine learning model, follows their gaze, moving the selection from one UI element to the next as they look around. No additional hardware is necessary.
Eye Tracking also works with Dwell, so when a user pauses their gaze on an interface element, that element is activated as if it were tapped. The feature, which requires a one-time calibration process, will work with both Apple’s and third-party apps on iPhones and iPads with an A12 Bionic chip or later.
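Apple hasn’t described how Dwell is implemented, but the behavior it describes — activate an element once the gaze has held still on it long enough — can be sketched in a few lines. This is a purely illustrative model; the function names, thresholds, and data shapes below are assumptions, not Apple’s API.

```python
# Hypothetical sketch of dwell-based selection: if the estimated gaze
# point stays within a small radius for a set duration, a "click" fires.
from dataclasses import dataclass

DWELL_SECONDS = 1.0   # how long the gaze must hold before a click fires
RADIUS = 40.0         # max drift (in points) still counted as holding

@dataclass
class GazeSample:
    x: float
    y: float
    t: float  # timestamp in seconds

def dwell_activations(samples):
    """Return the timestamps at which a dwell 'click' fires."""
    clicks = []
    anchor = None   # sample where the current hold began
    armed = True    # prevents repeat clicks during a single hold
    for s in samples:
        drifted = anchor is None or \
            ((s.x - anchor.x) ** 2 + (s.y - anchor.y) ** 2) ** 0.5 > RADIUS
        if drifted:
            anchor, armed = s, True          # gaze moved: restart the hold
        elif armed and s.t - anchor.t >= DWELL_SECONDS:
            clicks.append(s.t)               # held long enough: activate once
            armed = False
    return clicks

# A gaze that holds near (100, 100) for just over a second, then darts away,
# produces exactly one activation:
hold = [GazeSample(100, 100, 0.0), GazeSample(102, 101, 0.5),
        GazeSample(101, 99, 1.0), GazeSample(100, 100, 1.1),
        GazeSample(300, 300, 1.5)]
print(dwell_activations(hold))
```

The `armed` flag is the important design choice: without it, every sample after the threshold would fire another click, whereas a real dwell system must activate once per hold and then wait for the gaze to move on.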
Vocal Shortcuts
Vocal Shortcuts let users define custom utterances that launch shortcuts and other tasks. The phrases are set up and recognized on-device for maximum privacy, using a process similar to Personal Voice. The effect is like triggering a shortcut with Siri, but without the assistant’s trigger word or phrase.