Apple previews iOS 16 with new accessibility features

The news: Ahead of WWDC in just a few weeks, Apple previewed new accessibility features launching with iOS 16 and in other software updates later this year.

Here’s the rundown of the new accessibility features:

  • Door Detection: This helps users locate a door, know how far away they are from it, and understand its attributes, including whether it is open or closed and any signs around it
  • Apple Watch Mirroring: AirPlay your Apple Watch display to your iPhone in order to use iOS accessibility features like Voice Control or Switch Control, and even use external Made for iPhone switches
  • Quick Actions on Apple Watch: “a double-pinch gesture can answer or end a phone call, dismiss a notification, take a photo, play or pause media in the Now Playing app, and start, pause, or resume a workout”
  • Live Captions on iPhone, iPad and Mac: Live transcription of spoken words comes to media apps, FaceTime calls and more
  • VoiceOver gains support for over 20 new locales and languages, like Bengali, Bulgarian, Catalan, Ukrainian, and Vietnamese
  • Buddy Controller: someone can help you play video games by linking two controllers to a single input (this is super cool)
  • Siri Pause Time: “users with speech disabilities can adjust how long Siri waits before responding to a request”
  • Voice Control Spelling Mode: vocalize specific spellings with per-letter input
  • Sound Recognition expands to detect a “unique alarm, doorbell, or appliances”
  • Apple Books brings new themes and custom options such as word spacing for improved accessibility

Sam’s take: I do not see any other tech company in the world taking accessibility as seriously as Apple. These features look like incredible leaps forward for users with accessibility needs…and, incidentally, it’s also our first look at iOS 16. Why am I so certain of this? Apple did the same thing ahead of iOS 15, previewing new accessibility features last year.

Apple announces new accessibility features coming in future software updates

The news: To celebrate Global Accessibility Awareness Day, Apple has announced a slew of new accessibility features said to be rolling out in future software updates later this year.

  • AssistiveTouch for Apple Watch: “Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.”
  • Eye-Tracking Support for iPad allows you to use third-party eye-tracking devices to control your device
  • Explore Images with VoiceOver: “Users can now explore even more details about the people, text, table data, and other objects within images. Users can navigate a photo of a receipt like a table: by row and column, complete with table headers.”
  • Made for iPhone Hearing Aids and Audiogram Support: “In a significant update to the MFi hearing devices program, Apple is adding support for new bi-directional hearing aids…Apple is also bringing support for recognizing audiograms — charts that show the results of a hearing test — to Headphone Accommodations.”
  • New Background Sounds “help minimize distractions and help users focus, stay calm, or rest. Balanced, bright, or dark noise, as well as ocean, rain, or stream sounds continuously play in the background to mask unwanted environmental or external noise…”
  • Other features coming: Sound Actions for Switch Control (replaces physical buttons and switches with mouth sounds), per-app display and text size settings and new Memoji customizations (oxygen tubes, cochlear implants and a soft helmet)

Our take: This is a pretty sweeping set of announcements, but I’m most impressed by AssistiveTouch on Apple Watch. That looks like a monumental step forward for users with motor impairments and could be a game-changer. Plus, eye-tracking on iPad? I never thought we’d see something like that.