The Power (and Limits) of Automated Accessibility Testing

By Dan Holbrook

Senior Quality Assurance Engineer

Automation is a powerful tool for accessibility. We use it in almost all aspects of our accessibility work at The Nerdery, from estimation to design to development to quality assurance.

If you're new to accessibility, I recommend checking out two browser plugins: Color Contrast Analyzer for quick visual checks, and aXe for more detailed accessibility code checks. We use these tools almost every day in combination with our own custom accessibility script. Here's an example of a typical report from our scripting tests:

[Image: An accessibility report with Nerdery, aXe, Image, and SVG test results.]

Automation saves us time: it can test a website in hours that would take us weeks to test manually, it can find issues early that we might otherwise not discover until much later, and it can find issues we might not find at all without it. I love automation.

[Image: Bart Simpson hugging a robot, captioned "When the automation is ready to go". Image from Frinkiac.com.]
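If you want to poke at this yourself, the aXe plugin is built on the open-source axe-core engine, which you can also call from the browser console or a test script. Here's a minimal sketch, assuming axe-core has already been loaded on the page; this isn't our custom script, just the shape of the raw results:

```ts
// Minimal sketch: running the axe-core engine directly against a page.
// Assumes the axe-core script has already been loaded, which provides
// a global `axe` object. The aXe plugin is built on this same engine.
declare const axe: any;

axe.run(document).then((results: any) => {
  for (const violation of results.violations) {
    // Each violation carries a rule id, an impact rating, a
    // description, and the offending DOM nodes.
    console.log(`[${violation.impact}] ${violation.id}: ${violation.description}`);
    for (const node of violation.nodes) {
      console.log("  at", node.target.join(" "));
    }
  }
});
```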

If you're not new to accessibility, you probably predicted that the other shoe was about to drop. Too often, clients come to us after having relied on automated tests alone to ensure their website's accessibility.

The best news we can give those clients is that sometimes the automation is reporting non-issues. I have tested Nerdery.com in five different screen readers and dozens of browser and device combinations, and all the evidence I have so far says that the issues aXe reports don't affect actual users. ALL of the popular automation tools we've tested report false positives or overestimate the severity of issues. Here's an example of our review of failures reported by an automation tool we were evaluating, again using our own site as the target:

[Image: A table with columns for aXe results, the Nerdery script, tickets, and QA notes. Some issues are not replicable, or are improvements rather than failures.]
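Part of that review can itself be scripted. Below is a rough sketch of the triage idea: an allowlist of rule-and-selector pairs a human has already verified as non-issues. The rule id is a real axe rule, but the selector and the allowlist itself are entirely hypothetical:

```ts
// Rough triage sketch. Every entry here is hypothetical and, in
// practice, has to be earned by first testing the flagged element
// in real screen readers and browsers.
interface KnownNonIssue {
  ruleId: string;   // an axe rule id, e.g. "color-contrast"
  selector: string; // the element a human has already verified
}

const knownNonIssues: KnownNonIssue[] = [
  { ruleId: "color-contrast", selector: ".hero .overlay-text" }, // illustrative
];

// A reported violation still needs a ticket unless it matches a
// previously verified false positive.
function needsTicket(ruleId: string, selector: string): boolean {
  return !knownNonIssues.some(
    (known) => known.ruleId === ruleId && known.selector === selector
  );
}
```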

More often, a client comes to us and it isn't just an interpretation issue: we have to tell them that the automation isn't finding everything.

For WCAG 2.0 AA conformance testing (the generally accepted goal for accessibility compliance), automated tests represent only 10 percent of our test plan, and just 6 percent once you factor in tests that must be repeated across multiple browsers and screen readers.

[Image: Pie chart of WCAG compliance testing, showing 10 percent automated coverage.]

Automation is a good way to tell if "something" is wrong, but it won't find all the compliance issues on a site, especially a complex one. There are only two WCAG 2.0 success criteria that we rely on automation alone to test: 2.4.2 Page Titled and 4.1.1 Parsing. Every other criterion may require manual review, depending on how your site is built.

[Image: Screenshot of the Nerdery accessibility test plan, showing example WCAG criteria that require manual testing.]
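Those two criteria lend themselves to automation because they reduce to mechanical checks. Here's a sketch of what such checks might look like; the function names are ours, purely for illustration:

```ts
// 2.4.2 Page Titled: the document needs a non-empty, meaningful title.
// (Automation can verify "non-empty"; "meaningful" is still on you.)
function hasPageTitle(doc: Document): boolean {
  return doc.title.trim().length > 0;
}

// 4.1.1 Parsing: one common, machine-checkable failure is duplicate
// id attributes, which can confuse assistive technology.
function findDuplicateIds(doc: Document): string[] {
  const counts = new Map<string, number>();
  for (const el of Array.from(doc.querySelectorAll("[id]"))) {
    counts.set(el.id, (counts.get(el.id) ?? 0) + 1);
  }
  return Array.from(counts.entries())
    .filter(([, count]) => count > 1)
    .map(([id]) => id);
}
```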

Keep in mind that we WANT to use automation to test. It saves us time, it frees us up to test other things, and we love automation here. But even combining multiple automated tools has only been able to take us this far.

We start with automation, see what it flags, and try to save some time that way. Then we review manually to determine which of the automated findings were valid and what the automation missed.

[Image: aXe advising the user to test manually after a successful automation pass.]
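That split shows up in axe-core's own output: alongside definite violations, it returns an "incomplete" bucket of checks it couldn't decide, which is exactly where the manual pass begins. Here's a minimal sketch of the automated half of that workflow, assuming @axe-core/puppeteer; it's illustrative, not our production script:

```ts
import puppeteer from "puppeteer";
import { AxePuppeteer } from "@axe-core/puppeteer";

// Sketch of the first (automated) pass. Everything it prints still
// goes through a human review, as described above.
async function scan(url: string): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  const results = await new AxePuppeteer(page).analyze();

  // Definite failures: candidates for tickets, pending human review.
  for (const v of results.violations) {
    console.log(`[${v.impact}] ${v.id}: ${v.nodes.length} node(s)`);
  }

  // "incomplete" is axe's way of saying a human has to look; the
  // automation could not decide either way.
  console.log(`${results.incomplete.length} checks need manual review`);

  await browser.close();
}

scan("https://www.nerdery.com").catch(console.error);
```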

What does automation miss? It's terrible at simulating keyboard and screen reader navigation, which means a site can pass automation and still be unusable for someone with a motor or visual disability. (We see that a lot.) Automation is terrible at telling what you can see when the site is zoomed. It can't tell where you need headings, or whether a link's description is clear enough, or whether a navigation element you're moving around needs to stay in a predictable place. It can't tell if you're relying on color to convey something a colorblind user won't be able to see. It's terrible with error messages, because it isn't entering anything into your forms. It won't tell you if your CAPTCHA makes your form impossible for a screen reader user to submit. And it probably won't bother testing your site in a mobile view.
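You can script pieces of a keyboard pass (drive the Tab key, log where focus lands), but judging whether the result makes sense is still human work. A rough sketch, again assuming Puppeteer; the URL and step count are arbitrary:

```ts
import puppeteer from "puppeteer";

// Records where focus lands on each Tab press. This can surface
// obvious problems such as focus vanishing or getting trapped, but
// it cannot tell you whether the order is sensible; that takes a person.
async function recordTabOrder(url: string, steps = 20): Promise<void> {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  for (let i = 1; i <= steps; i++) {
    await page.keyboard.press("Tab");
    const focused = await page.evaluate(() => {
      const el = document.activeElement;
      if (!el || el === document.body) return "(no visible focus)";
      const text = (el.textContent ?? "").trim().slice(0, 40);
      return `<${el.tagName.toLowerCase()}> "${text}"`;
    });
    console.log(`${i}. ${focused}`);
  }

  await browser.close();
}

recordTabOrder("https://www.nerdery.com").catch(console.error);
```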

Automation should be your first line of attack. It's ours. But it's not enough, and you shouldn't trust anyone who tries to convince you it is. Double-check the site with a free screen reader like VoiceOver or NVDA, tab through it with your keyboard, and zoom in to see what happens. It's the users of your site, not the robots, who need accessibility.

We’re available to help, too. Want to learn more about our approach to accessibility? Contact us.

Published on 06.01.17