The news: To celebrate Global Accessibility Awareness Day, Apple has announced a slew of new accessibility features coming in software updates later this year.
- AssistiveTouch for Apple Watch: “Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.” (A rough sketch of the kind of motion data involved follows this list.)
- Eye-Tracking Support for iPad: lets users control their iPad with third-party eye-tracking devices.
- Explore Images with VoiceOver: “Users can now explore even more details about the people, text, table data, and other objects within images. Users can navigate a photo of a receipt like a table: by row and column, complete with table headers.”
- Made for iPhone Hearing Aids and Audiogram Support: “In a significant update to the MFi hearing devices program, Apple is adding support for new bi-directional hearing aids…Apple is also bringing support for recognizing audiograms — charts that show the results of a hearing test — to Headphone Accommodations.”
- New Background Sounds “help minimize distractions and help users focus, stay calm, or rest. Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds continuously play in the background to mask unwanted environmental or external noise…”
- Other features coming: Sound Actions for Switch Control (replaces physical buttons and switches with mouth sounds), per-app display and text size settings, and new Memoji customizations (oxygen tubes, cochlear implants, and a soft helmet).
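To make the AssistiveTouch sensor idea a little more concrete (and to be clear, this is not Apple’s implementation): here’s a minimal Swift sketch that reads wrist-motion data through the public CoreMotion framework. The `GestureDetector` class and the crude acceleration threshold are purely illustrative stand-ins for the on-device machine learning model Apple describes, which also fuses gyroscope and optical heart rate data.

```swift
import CoreMotion

// Illustrative sketch only: a fixed threshold standing in for Apple's
// on-device ML gesture model. `GestureDetector` is a made-up name.
final class GestureDetector {
    private let motion = CMMotionManager()

    func start() {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 50.0  // sample at ~50 Hz
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let d = data else { return }
            // Magnitude of user-generated acceleration (gravity removed).
            let a = d.userAcceleration
            let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
            // A short, sharp spike in wrist acceleration stands in here for
            // the kind of signal a clench might produce.
            if magnitude > 1.5 {
                print("Possible clench-like gesture detected")
            }
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```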
Our take: This is a pretty sweeping set of announcements, but I’m most impressed by AssistiveTouch on Apple Watch. That looks like a monumental step forward for people with motor impairments and could be a game-changer. Plus, eye-tracking on iPad? I never thought we’d see something like that.