Why AI-based accessibility testing falls short, and the alternatives

Experience Dynamics
5 min read · Feb 28, 2022

by Frank Spillers
CEO/CXO @ Experience Dynamics; UXInnerCircle.com


Summary: Automation of accessibility audits, while a great idea, is better used as an adjunct tool than as a replacement for best-practice accessibility testing. Chiefly, automation alone cannot credibly detect all accessibility issues. Instead, use AI-based accessibility checkers after you have done a thorough job of addressing issues, including testing with users with disabilities (aka “Holistic testing”), or if you change your design frequently (weekly/monthly).

Essential distinctions to begin with

When we use the term “accessibility testing,” we need to be clear about what we mean. Most of the time, accessibility testing refers to a method that does not involve users. Wait, testing that does not include users? That should set your alarm bells ringing. All user experience work, including accessibility, centers problem-solving and the validation of assumptions around actual user behavior. Why would users with disabilities be any different?

Let’s define three methods that fall under ‘accessibility testing’:

1. Tool-based accessibility testing. A ‘checker tool’ such as WAVE, Axe, or another inspection service checks your site against accessibility guidelines (a minimal code sketch follows this list).
An accessibility “checker tool” reviews a site against WCAG guidelines.

2. Accessibility testing with users with disabilities (aka “Holistic testing”). Users with disabilities go through your site, app, or product using their native Assistive Technology (AT), such as a screen reader. Note: No longer a ‘nice to have,’ the proposed WCAG 3.0 standard requires testing with users with disabilities.

A blind user encounters a non-accessible PDF: none of the images are tagged for screen reader access.

3. Expert testing with Assistive Technology (AT). An accessibility expert or specialist evaluates your product against standards and known issues. This person is experienced in accessibility work and has a legally recognized disability.

A blind specialist evaluates accessibility and discovers a design flaw overlooked in the design: instead of presenting a quiz answer choice that must be visually matched to A, B, or C, the system should announce “B is correct,” since blind users cannot make the visual leap that sighted users can.
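To make the first method concrete, here is a minimal sketch of a tool-based check using the open-source axe-core engine (the same engine behind the Axe tool mentioned above). The field names reflect the axe-core results API; the exact rules reported will vary by version:

```typescript
// Run in the browser context of the page under audit.
import axe from "axe-core"; // requires esModuleInterop

async function runAccessibilityCheck(): Promise<void> {
  // Scan the current document against axe-core's WCAG-mapped rules.
  const results = await axe.run(document);

  console.log(`${results.violations.length} rule violations found`);
  for (const violation of results.violations) {
    // Each violation maps to a rule (e.g. "image-alt") plus affected nodes.
    console.log(`[${violation.impact ?? "n/a"}] ${violation.id}: ${violation.help}`);
    for (const node of violation.nodes) {
      console.log(`  -> ${node.target.join(" ")}`);
    }
  }
}

runAccessibilityCheck();
```

Note what this buys you and what it does not: the scan is fast and repeatable, but it can only flag rule violations it can detect programmatically, which is exactly the coverage gap discussed below.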

It is essential to clarify which type of testing you mean whenever you use the term ‘accessibility testing.’ Naming the type of testing will keep you and your audience honest.

Which of these is better? All three are required, as we have found over 20 years of accessibility work at the UX consultancy Experience Dynamics. Accessibility is nuanced by disability type and AT version, and code that is theoretically compliant can still trigger bugs on existing platforms and fail when tested. So how about automated accessibility testing?

Enter ‘accessibility testing’ by an algorithm

AI-based accessibility testing. A new generation of AI-based services will, for $9.99–$99.00 a month, send an automated algorithm to scan your site and identify accessibility issues.

Almost all of these new services use a subscription model (monthly checking). The price point is tempting. Many organizations, including non-profits and others new to digital accessibility, feel this makes for a “low barrier” to comply with accessibility or Inclusive Design efforts. The problem with the subscription model is that constant checking is not entirely necessary. It is a business model choice, not an accessibility compliance requirement.

When constant checking makes sense: a) your site is large and you are constantly changing the design (adding new pages/layouts/content), or b) you are reckless: you lack knowledge of accessibility practices, don’t care to learn simple corrective measures like ALT tagging (a sketch of such a check follows), or have too many cooks in the kitchen and need some oversight.
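For context, an ALT-tag check is exactly the kind of simple, automatable scan these subscription services run. Here is a hypothetical minimal version in TypeScript (a toy helper, not a real library API), which shows both how easy the check is and what it cannot judge:

```typescript
// A toy version of a "missing alt text" scan. It flags a guaranteed
// failure, but it cannot judge whether existing alt text is actually
// meaningful to a screen reader user.
function findImagesMissingAlt(root: Document = document): HTMLImageElement[] {
  const images = Array.from(root.querySelectorAll<HTMLImageElement>("img"));
  // An empty alt="" is valid for decorative images; only a *missing*
  // attribute is an unambiguous violation.
  return images.filter((img) => !img.hasAttribute("alt"));
}

for (const img of findImagesMissingAlt()) {
  console.warn("Image missing alt attribute:", img.src);
}
```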

AI-based or automated accessibility checking can provide a false sense of security or, worse, position accessibility as something checked off a list. Instead, accessibility needs to be ongoing, part of design, content, and developer responsibility. An algorithm can only help you “clean up” your act for so long before you confront the need to adjust your product development process to include accessibility best practices. In short, you are committing to accessibility as part of your Inclusive Design strategy and disability advocacy efforts.

How AI-based accessibility testing measures up

Recent reports estimate that automated testing tools find 20–30% of issues. Compare this with the 57% of issues found by a widely used (credible) checker tool (Deque report, 2021). Note that 57%, while better than 30%, still falls short of the whole picture. As the comparison of accessibility methods below shows, combining or triangulating methods is advised.

[Figure: accessibility methods compared. Automated checkers find 20–30% of issues; manual checker tools, 30–57%; guidelines alone, 15–50%; testing with users with disabilities, 60–80%.]
Why taking a triangulated approach to accessibility evaluation is recommended.

Triangulate your approach with an emphasis on testing with users with disabilities

At Experience Dynamics, we use a three-pronged approach to assessing and optimizing accessibility: a review of WCAG guidelines alongside checker tools and testing with users with disabilities.

A positive development coming in WCAG 3.0 (the proposed Web Content Accessibility Guidelines from the W3C) is a requirement for testing with users with disabilities. The guidelines contrast user or ‘Holistic testing’ with ‘Atomic testing’ (manual or automated testing). Yes, the naming is a little ‘out there,’ but it makes sense given the ambiguity of how people talk about ‘accessibility testing’ without clarifying which kind, as noted above.

The Atomic testing guidelines clarify the role of automated testing in a way you won’t hear from vendors selling accessibility evaluation software:

Per WCAG 3.0: “Atomic tests may be automated or manual. Automated evaluation can be completed without human assistance. These tests allow for a larger scope to be tested but *automated evaluation alone cannot determine accessibility*. Over time, the number of accessibility tests that can be automated is increasing, but manual testing is still required to evaluate most methods at this time.”
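In practice, the automatable slice of Atomic testing is best wired into your build pipeline so regressions surface early, while Holistic testing with users stays on the calendar. Here is a sketch of such a CI check, assuming the @axe-core/playwright package and a placeholder URL; per the quote above, a green run here still does not mean the page is accessible:

```typescript
// Playwright test that runs axe-core against a page in CI.
// Passing only means no *machine-detectable* violations were found.
import { test, expect } from "@playwright/test";
import AxeBuilder from "@axe-core/playwright";

test("home page has no machine-detectable violations", async ({ page }) => {
  await page.goto("https://example.com"); // placeholder URL
  const results = await new AxeBuilder({ page }).analyze();
  expect(results.violations).toEqual([]);
});
```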

In summary, it is irresponsible to use ‘accessibility testing’ as a blanket term. Instead, we need to clarify what we mean and understand that accessibility work requires an overlap of tools and techniques to gain accurate insights. Since most accessibility or ‘Inclusive Design’ efforts focus on automation (AI bots) or checker tools, more focus is needed on involving users with disabilities in testing. Automated testing can play a role, but it is not a complete solution. Worse, it can give you the idea you have accessibility covered when you don’t.

What won’t work? Continuing to see accessibility as a purely technical developer task. Thinking you can “check off” or tackle accessibility with a cost-effective and shiny new AI-based accessibility audit tool.


About the author: Frank Spillers is CEO/CXO of Experience Dynamics, a leading UX consulting firm with Fortune 500 clients worldwide.

For over 20 years, Frank has been a seasoned UX consultant, researcher, designer, and trainer. He is an award-winning expert in improving the design of digital products, services, and experiences. Frank is a subject matter expert in UX design, UX management, accessibility, emotion design, service design, localization UX, Lean UX, and VR/AR UX design. He provides private corporate training and offers courses through the largest online design organization in the world (the Interaction Design Foundation). In 2001, Frank founded the UX consulting firm Experience Dynamics. He provides deep learning opportunities at UX Inner Circle.
