Apple today announced a range of new accessibility features coming to the iPhone and iPad later this year as part of iOS 18 and iPadOS 18. Among them is Eye Tracking, an AI-powered feature that lets users control their devices using just their eyes.
Eye Tracking will arrive on the iPhone and iPad later this year, letting users with motor impairments navigate their devices using only their eyes, which are tracked by the front-facing camera. Apple says Eye Tracking can be set up in a few seconds and that everything is processed on-device using artificial intelligence.
Eye Tracking works system-wide across iOS and iPadOS apps and requires no additional hardware. Paired with Dwell Control, it lets users activate an app’s buttons and other on-screen elements, and perform swipes and other gestures, with their eyes alone.
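Since Eye Tracking navigates the elements apps already expose, third-party developers presumably don’t need any Eye Tracking-specific code; clear accessibility labels should be enough. Here is a minimal SwiftUI sketch of that kind of labeling (PlayerView and its play/pause action are hypothetical examples, not anything from Apple’s announcement):

```swift
import SwiftUI

// Minimal sketch: assistive input methods such as Dwell Control act on
// an app's accessibility elements, so giving a control a descriptive
// label is the standard way to make it identifiable. "PlayerView" is a
// hypothetical example view.
struct PlayerView: View {
    @State private var isPlaying = false

    var body: some View {
        Button(action: { isPlaying.toggle() }) {
            Image(systemName: isPlaying ? "pause.fill" : "play.fill")
        }
        // A plain-language label lets any accessibility feature,
        // Eye Tracking included, present and trigger this button.
        .accessibilityLabel(isPlaying ? "Pause" : "Play")
    }
}
```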
“We believe deeply in the transformative power of innovation to enrich lives,” said Apple CEO Tim Cook. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Another accessibility feature Apple is introducing is Music Haptics, which uses the iPhone’s Taptic Engine to play “taps, textures, and refined vibrations to the audio of the music.” Music Haptics will work with millions of songs on Apple Music, and the company says an API will let developers bring the feature to third-party music streaming apps.
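Apple hasn’t yet published details of that API, but iPhone haptics are already programmable through the Core Haptics framework, which gives a sense of the taps and textures involved. A minimal sketch using the existing CHHapticEngine API (the pattern values are illustrative, not Apple’s Music Haptics implementation):

```swift
import CoreHaptics

// Minimal sketch: a sharp transient "tap" followed by a softer
// continuous "texture" on the Taptic Engine. Music Haptics itself is
// a system feature; this only illustrates the underlying engine.
func playTapAndTexture() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
        ],
        relativeTime: 0
    )
    let texture = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.2),
        ],
        relativeTime: 0.1,
        duration: 0.5
    )

    let pattern = try CHHapticPattern(events: [tap, texture], parameters: [])
    try engine.makePlayer(with: pattern).start(atTime: CHHapticTimeImmediate)
}
```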
Vocal Shortcuts is another new accessibility feature Apple plans to launch with iOS 18. With Vocal Shortcuts, users will be able to assign custom phrases that trigger shortcuts and other complex tasks. It is joined by Listen for Atypical Speech, which recognizes a user’s individual speech patterns to enhance speech recognition for people with acquired or progressive conditions that affect speech.
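Apple didn’t say how apps hook into Vocal Shortcuts, but shortcuts are typically exposed to the system through the App Intents framework, so a custom phrase should be able to trigger any intent an app provides. A minimal sketch, assuming a hypothetical StartWorkoutIntent and WorkoutSession (illustrative names, not part of any Apple API):

```swift
import AppIntents

// Hypothetical app-side session type, stubbed for illustration.
final class WorkoutSession {
    static let shared = WorkoutSession()
    func begin() { /* start tracking a workout */ }
}

// Minimal sketch of an App Intents action. Once an app exposes an
// intent like this, a user can wrap it in a shortcut and, with Vocal
// Shortcuts, assign a custom spoken phrase to run it.
struct StartWorkoutIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout"
    static var description = IntentDescription("Begins a workout session.")

    func perform() async throws -> some IntentResult & ProvidesDialog {
        WorkoutSession.shared.begin()
        return .result(dialog: "Workout started.")
    }
}
```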
Apple Vision Pro will gain new accessibility features of its own with the launch of visionOS 2. Live Captions will make their way to the headset, letting users who are hard of hearing see real-time captions of conversations with people around them, as well as of spoken dialogue in FaceTime with Personas and in other apps.
Apple is also rolling out many other accessibility improvements across the board, including:
- Voice Control, Color Filters, and Sound Recognition on CarPlay
- Vehicle Motion Cues to reduce motion sickness for passengers in moving vehicles
- A new Reader Mode for the Magnifier app
- A Virtual Trackpad for AssistiveTouch, which turns a region of the display into a trackpad