Door Detection will use the lidar scanner and machine learning to identify doors and relay information about their location, labeling, and more to blind or low-vision users.
Global Accessibility Awareness Day is Thursday, so Apple took to its newsroom blog this week to announce several major new accessibility features headed to the iPhone, Apple Watch, iPad, and Mac.
One of the most widely used will likely be Live Captions, which is coming to iPhone, Mac, and iPad. The feature shows AI-driven, live-updating subtitles for speech coming from any audio source on the phone, whether the user is “on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them.”
The text (which users can resize at will) appears at the top of the screen and ticks along as the subject speaks. Additionally, Mac users will be able to type responses and have them read aloud to others on the call. Live Captions will enter public beta on supported devices (“iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon”) later this year.