Apple on Tuesday highlighted a handful of accessibility features that will be available on various iPhone, iPad and Apple Watch models later this year. The tech giant said the new features aim to help people with disabilities “navigate, connect, and get the most out of Apple products.”
The features that Apple previewed include Door Detection, Apple Watch Mirroring, Live Captions and expanded VoiceOver.
Door Detection will let people who are blind or have low vision use an iPhone or iPad to locate a door when arriving at a new place and find out how far away it is. The feature, which uses lidar, can also describe door attributes, such as whether it's open or closed and what type of handle it has, and read information on and around the door, like a room number. The feature will be available in Magnifier, Apple's app for supporting people who are blind or have low vision, and will work on the iPhone 13 Pro, iPhone 13 Pro Max, iPhone 12 Pro, iPhone 12 Pro Max and newer iPad Pro models.
Apple Watch Mirroring aims to help people with physical and motor disabilities use the smartwatch's features, like its blood oxygen sensor and heart rate monitoring, without having to tap on its small display. Apple said Mirroring pairs with an iPhone and lets people use "assistive features like Voice Control and Switch Control, and use inputs including voice commands, sound actions, head tracking, or external Made for iPhone switches."
In addition to Door Detection and Mirroring, Apple said it will bring Live Captions to the iPhone, iPad and Mac to help people follow along with audio and video in FaceTime, video conferencing apps, streaming media and more. It'll be available in beta, in English, later this year on the iPhone 11 or later, iPads with the A12 Bionic chip or later, and Macs with Apple silicon. Apple will also update its VoiceOver screen reader to support over 20 more languages, including Bengali, Bulgarian, Catalan, Ukrainian and Vietnamese.
The announcement comes ahead of Global Accessibility Awareness Day on Thursday.