Apple Announces SignTime Service & Powerful Accessibility Features for People With Disabilities: Report
To further help people with mobility, vision, hearing and cognitive disabilities, Apple has announced powerful accessibility software updates across iOS, watchOS and iPadOS.
Cupertino: To further help people with mobility, vision, hearing and cognitive disabilities, Apple has announced powerful accessibility software updates across iOS, watchOS and iPadOS, including a new SignTime service to connect Apple Store and Apple Support customers with on-demand sign language interpreters. To begin with, customers visiting Apple Store locations in the US, the UK and France can use SignTime to remotely access a sign language interpreter right in their web browsers.
Later this year, people with limb differences will be able to navigate Apple Watch using AssistiveTouch. iPad will also support third-party eye-tracking hardware for easier control, and for blind and low-vision communities, Apple's 'VoiceOver' screen reader will get even smarter, using on-device intelligence to explore objects within images, the company said in a statement late on Wednesday.
"With these new features, we're pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can't wait to share them with our users," said Sarah Herrlinger, Apple's senior director of Global Accessibility Policy and Initiatives.
In support of neurodiversity, Apple is introducing new background sounds to help minimise distractions, and for those who are deaf or hard of hearing, Made for iPhone (MFi) will soon support new bi-directional hearing aids. AssistiveTouch for watchOS allows users with upper body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls.
"iPadOS will support third-party eye-tracking devices, making it possible for people to control iPad using just their eyes," Apple said. Later this year, compatible MFi devices will track where a person is looking onscreen and the pointer will move to follow the person's gaze, while extended eye contact performs an action, like a tap, the company added.
Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch can detect subtle differences in muscle movement and tendon activity, which lets users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench.
(The above story first appeared on LatestLY on May 20, 2021 11:34 AM IST.)