The essence of accessible color contrast is simple. Given a foreground color and a background color, the contrast between the two must be distinguishable in a wide variety of environments, by individuals with different color perception abilities. Under the Web Content Accessibility Guidelines (WCAG) version 2.0, contrast is measured by an algorithm that compares the relative luminance of the two colors and returns a ratio, which must exceed WCAG's recommended minimum.

But the reality of color contrast is more complicated. There are a lot of assumptions to work out before you can be confident that visually impaired shoppers can use your ecommerce site.
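That algorithm can be sketched in a few lines. Here is a minimal Python version of the WCAG 2.0 calculation, assuming 8-bit sRGB color tuples; the function names are my own, not part of any library:

```python
# Minimal sketch of the WCAG 2.0 contrast-ratio algorithm.
# Assumes colors are 8-bit sRGB tuples like (255, 255, 255).

def relative_luminance(rgb):
    """Relative luminance of an sRGB color, per WCAG 2.0."""
    def linearize(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, ranging from 1:1 up to 21:1."""
    lighter = max(relative_luminance(fg), relative_luminance(bg))
    darker = min(relative_luminance(fg), relative_luminance(bg))
    return (lighter + 0.05) / (darker + 0.05)

# Black on white yields the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))
```

For reference, WCAG 2.0 level AA requires a ratio of at least 4.5:1 for normal-size text, and 3:1 for large text.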
What Are ‘Foreground’ and ‘Background’ Colors?
…This clearly applies when text needs to be distinguished from the background color. White text on a white background may as well be absent entirely. But it also applies to neighboring text, such as a link within a paragraph. If that link looks the same as the text surrounding it, there will be no way of identifying visually which text is linked.
If your links are underlined, this becomes a non-issue.
What Two Colors Are We Comparing?
The simple case of text in a single color and a background in another is easy. When assessing contrast, automated tools will reliably identify the two different colors.
But that is not true with other common cases:
Text with text-shadow;
Transparency in one or more colors.
WCAG recommends adding a thin black outline, at least one pixel wide, to maintain a contrast ratio between a letter and its background. While this doesn't map directly to text shadow, it's reasonable to extrapolate that if you add a text shadow that darkens the boundary of your text, you can measure contrast against the shadow color rather than depending on the background color.
Apple briefly mentioned some accessibility enhancements during its press event this week — watchOS 3 is adding wheelchair-specific optimizations to Apple Watch — but iOS 10, macOS Sierra, tvOS 10 and watchOS 3 also have many other improvements to assist users with motor, vision, hearing, and learning impairments. Here’s the rundown…
Microsoft announces upcoming accessibility enhancements for Office 365 Android and iOS apps and adds several new data transformation options to Excel.
Building on Office’s existing accessibility options for visually impaired users, Microsoft revealed that it plans to add new capabilities.
“I am excited to announce that Office 365 teams are not only working on enhancing the usability of VoiceOver with Office 365 iOS apps and Narrator with Windows 10 Mobile apps, but also the usability of TalkBack with Office 365 Android apps,” stated John Jendrezak, accessibility lead and partner director of program management for the Microsoft Office Engineering group, in a recent blog post.
VoiceOver and TalkBack are text-to-speech technologies that help users navigate their software audibly.
Google’s Latest Accessibility Feature Is So Good, Everyone Will Use It
Though it was developed for users with severe motor impairment, Voice Access could revolutionize how anyone uses their phone.
Announced this week at I/O 2016 as something that will ship with Android N, Voice Access is a way for people with severe motor impairment to control every aspect of their phones using their voices. But once you see it in action, the broader impact of Voice Access is immediately obvious.
Here’s how it works. When Voice Access is installed, you can enable it with Android’s “Okay Google” command by just saying: “Okay Google, turn on Voice Access.” Once it’s on, it’s always listening—and you don’t have to use the Okay Google command anymore. With Voice Access, all of the UI elements that are normally tap targets are overlaid by a series of numbers. You can tell Voice Access to “tap” these targets by saying the corresponding number aloud.
But these numbers are actually meant to serve as a backup method of control: You can also just tell Voice Access what you want to do. For example, you could ask it to “open camera,” and then tell it to “tap shutter.” Best of all? Any app should work with Voice Access, as long as it’s already following Google’s accessibility guidelines.
Technically, Voice Access builds upon two things that Google’s been laying the groundwork on for a while now. The first is natural language processing, which allows Google Assistant to understand your voice.
This blog post by Marian Foley is the first in a series about people with access needs. The aim of the series is to raise awareness of the different ways people access websites, the common issues they face, and what designers and developers can do to remove those issues.
Marian Foley, content designer and particular-needs IT user, spoke to us about the problems she faces and her solutions. Most importantly, she answers the question: how can we make the web more accessible?
What should content designers and developers be doing?
The most obvious thing for me is to use Responsive Web Design (RWD). This solves the problem of websites not fitting on my screen and I can access the same options as everyone else. Since RWD became mainstream around 2012/13 I’ve been able to use the mobile versions of most websites (including GOV.UK and the backend of GOV.UK). I’m a big fan!
Design accessible websites by:
1. making your layout clear and simple
2. putting menus at the top of the page, on the left if possible, so that people using a low-resolution screen find them quickly
3. using .png files for diagrams because they support transparency; someone using their own colour scheme will see their colour preference as the background colour
4. providing text alongside icons and images to explain what’s going on
5. publishing HTML pages rather than .pdf files, because HTML is accessible to more users
6. taking alternative text attributes off diagrams and putting the description on the page itself; people who don’t use screen readers but can’t read the text in your image won’t miss out (use an empty alt attribute, “”, because the image no longer needs a description)
Google’s Eve Andersson tells Co.Design how today’s accessibility problems could lead to improvements in robots, Google Maps, and even YouTube.
TEACHING AIS HOW TO NOTICE, NOT JUST SEE
Like Microsoft, which recently announced a computer vision-based accessibility project called Seeing AI, Google’s interested in how to convey visual information to blind users through computer vision and natural language processing. And like Microsoft, Google is dealing with the same problem: How do you communicate that information without simply reading aloud an endless stream-of-consciousness list of everything a computer sees around itself, regardless of how trivial those things may be?
Thanks to Knowledge Graph and machine learning—the same principles that Google uses to let you search photos by content (like photos of dogs, or photos of people hugging)—Andersson tells me that Google is already good enough at identifying objects to decode them from a video stream in real time. So a blind user wearing a Google Glass-like wearable, or a body cam hooked up to a smartphone, could get real-world updates on what can be seen around them.
But again, the big accessibility problem that needs to be solved here is one of priority.
Much has been made recently of Google’s advances in natural language processing, or Google’s ability to understand and transcribe human speech. Google’s accessibility efforts lean heavily upon natural language processing, particularly its latest innovation, Voice Access. But Andersson says computers need to understand more than just speech. Forget natural language processing: computers need non-language processing.
TAKING NAVIGATION BEYOND GOOGLE MAPS
Sighted users are so used to taking directions from computers that many people (like me) can barely find their way around without first plugging an address into Waze. But moving sighted individuals from point A to point B, across well-plotted roads and highways, is navigation on a macro scale. Things get much more complicated when you’re trying to direct a blind person down a busy city street, or from one store to another inside a shopping mall. Now you’re directing people on a micro scale, in an environment that is not as well understood or documented as roads are.
For athletes who use a wheelchair, and everyday wheelchair users looking to track their exercise and calories burned, Apple has good news for you. Apple Watch will include manual wheelchair fitness tracking in its free watchOS 3.0 update, to be released later this year.
Apple made the announcement at its annual Worldwide Developer Conference (WWDC) on June 13, 2016. In watchOS 3, the Activity app will offer a setting for wheelchair users: wheelchair pushes contribute to all-day calorie goals, the “time to stand” reminder becomes “time to roll,” and dedicated wheelchair-specific workouts are available.

With this update, Apple Watch will become the first fitness tracking device for wheelchair users. “We want to make products that serve every walk of life,” Apple’s chief operating officer, Jeff Williams, said in an interview. “We realized that while it was great for messages on the wrist, we wanted to offer [people with disabilities] the same opportunity to get healthier using Apple Watch.”
Are online-only businesses like Uber and Airbnb covered by Title III of the ADA, and what would coverage mean when the businesses don’t own or operate the vehicles or accommodations that customers use?
Title III of the ADA applies only to owners, operators, lessors, and lessees of “place[s] of public accommodation.” Businesses such as Uber and Airbnb do not fit neatly into this definition because, as web-only businesses, they are not actual “places” of public accommodation. Moreover, they don’t own or operate the goods or services – the vehicles or accommodations – used by the end customer.