Apple today made a series of hardware announcements.
Understandably, the announcement that has caused the most social media chatter in the blind community relates to the iPhone X and its new Face ID feature.
Apple has earned our trust over the years by ensuring that its products are fully accessible from their initial launch, so few observers were in any doubt that Apple would have given thought to the accessibility of this new feature. However, were there limitations of the technology that simply made it a non-starter for some people?
I wrote to Apple, and quickly received a response to some of my initial questions.
My questions stem from the fact that I am congenitally blind. My particular eye condition causes my eyes to look small and a little sunken, and they are often closed. Further, I have a form of congenital cataracts. I was curious to know whether Face ID would work for someone like me and others I know with prosthetic eyes, given that during the keynote, Apple indicated that the iPhone X would not unlock unless you gave the phone your attention.
Apple says the following:
The iPhone X has been designed with a number of accessibility features to support its use.
For VoiceOver users, Face ID will prompt you on how to move your head during setup in order to complete a scan. If you do not want Face ID to require attention, you can open Settings > General > Accessibility and disable Require Attention for Face ID. This is automatically disabled if you enable VoiceOver during initial setup.
‘Aipoly Vision’ is a very useful object-and-colour recogniser app that helps the blind, vision-impaired, and colour blind to understand their surroundings. It does so by using artificial intelligence to recognise objects through a device’s camera and then announces the name of each object to the user.
The Aipoly developers are on a self-declared mission to build scalable vision intelligence. They intend to add facial recognition to the Aipoly Vision app, whereby users will be able to enter the names of people visible in the camera frame for ongoing recognition. They have also indicated that the app will soon be able to be taught new objects: when pointed at an object it does not recognise, users will be able to enter the object's name, which the app will remember the next time that object is encountered.
This app is an excellent example of how emerging technology can make a positive difference to users right now, and it comes with an ‘intelligent torch’ feature which automatically turns on the device’s torch if the camera frame is too dark, allowing the app to work in low-light situations.
Curated by lifekludger