Automated tools play an important role in web accessibility testing. When used properly, they can help expedite the accessibility testing process and streamline accessible web development overall.
However, automated testing tools fall short of fully evaluating all of the success criteria set by the Web Content Accessibility Guidelines (WCAG). Many of these requirements depend on context and human judgment, which makes them impossible to evaluate through automation alone. In these cases, manual assessment and interpretation are necessary.
In this article, you’ll learn about where automated tools fit into the website accessibility testing workflow, as well as what types of criteria are best left for manual testing.
What are the benefits of automated web accessibility testing?
Despite their limitations, automated tools are still a crucial part of the accessibility testing process. In fact, one study by Deque found that their automated tool was able to detect, on average, around 57% of the accessibility errors on a website.
While the effectiveness of automated testing varies from site to site, it’s still the best way to kick-start an accessibility audit or to catch errors periodically during web development. Automated testing excels at uncovering the “quick wins”: errors that are more straightforward than those found during manual testing.
Automated accessibility tests can also be integrated into website build processes so that they will run with little to no extra effort as changes are made. This type of automation can help serve as a fallback to minimize the number of errors that make it to a live website.
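As a sketch of what that build-time integration can look like, the short Python example below scans a directory of built HTML pages and reports images that ship with no alt attribute at all. Both the check and the `site` output directory are illustrative assumptions about your build, not a substitute for a full testing engine:

```python
from html.parser import HTMLParser
from pathlib import Path


class AltTextAuditor(HTMLParser):
    """Collects the src of every <img> tag with no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # alt="" is valid for decorative images, so only a *missing*
        # attribute is flagged here.
        if tag == "img" and "alt" not in attrs:
            self.missing.append(attrs.get("src", "(no src)"))


def missing_alt(html: str) -> list:
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.missing


def audit_build(output_dir: str = "site") -> int:
    """Scan built pages and return an exit code for the CI step.
    The "site" directory name is an assumption about your build output."""
    failures = 0
    for page in Path(output_dir).glob("**/*.html"):
        for src in missing_alt(page.read_text()):
            print(f"{page}: image missing alt text: {src}")
            failures += 1
    return 1 if failures else 0
```

Wired into a pipeline as `raise SystemExit(audit_build())`, a check like this fails the build before an obvious error reaches the live site.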
Furthermore, automated testing tools are typically straightforward to run, so they can be a great way for less technical team members to help with web accessibility testing.
Common criteria that automated accessibility tests can help evaluate
Automated accessibility tests can assist in assessing standard benchmarks, such as:
- Color contrast against solid backgrounds
- Form elements
- Image alt text
- Invalid code or ARIA markup
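The color contrast check in particular is fully mechanical: WCAG defines a relative-luminance formula and minimum ratios (4.5:1 for normal text, 3:1 for large text, at level AA). A minimal Python sketch of that calculation:

```python
def _linear(channel: int) -> float:
    # sRGB channel (0-255) to linear light, per the WCAG
    # relative-luminance definition.
    c = channel / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4


def relative_luminance(rgb) -> float:
    r, g, b = (_linear(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b


def contrast_ratio(fg, bg) -> float:
    # Ratio of the lighter luminance to the darker, offset by 0.05.
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)


def passes_aa(fg, bg, large_text: bool = False) -> bool:
    # WCAG 2.1 level AA thresholds.
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)
```

Black on white yields the maximum ratio of 21:1, while the commonly cited gray #767676 on white sits just above the 4.5:1 cutoff. This is exactly the kind of deterministic rule automated tools apply well against solid backgrounds.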
Examples of automated accessibility testing tools
If you’re looking for an automated testing tool to add to your workflow, consider some of these popular options:
- axe DevTools, Deque’s browser extension built on the open-source axe-core engine
- WAVE, WebAIM’s free browser-based evaluation tool
- Lighthouse, which ships with Chrome DevTools and includes an accessibility audit
- Pa11y, a command-line tool well suited to automated runs in a build pipeline
When is manual web accessibility testing better than automated testing?
Website accessibility issues are often nuanced and open to interpretation. This is both the challenging part of accessibility testing and why automated tools can only get you so far. Manual testing is needed for many of the WCAG criteria to properly assess the level of compliance and to determine the best solution to resolve errors.
But you don’t have to be an accessibility expert to help with manual testing. While you will likely need an experienced accessibility professional at some point, even less technical team members can help test for accessibility issues without relying on automated tools.
Here are some examples of the accessibility issues manual testing can help you uncover that automated tools overlook. While this isn’t an exhaustive list, it can serve as a start to understanding the distinctions between manual and automated testing.
One area of web accessibility testing is ensuring keyboard-only users can successfully navigate your website. According to web.dev, “about 25% of all digital accessibility issues are related to a lack of keyboard support”.
Luckily, with such universal use of keyboards, this is one of the most straightforward manual tests to conduct. Manual keyboard testing can help to uncover errors like:
- Poor navigation. Keyboard-only users should be able to easily navigate through website pages, content, and links without barriers.
- Focus errors. Any element that receives keyboard focus should have a visible focus indicator, and the order in which elements receive focus should match their visual order on the page.
- Inaccessible content. All content that is accessible with a mouse should also be accessible to keyboard users, such as content that displays on hover.
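One narrow slice of focus-order checking can be scripted as a first pass: a positive tabindex value overrides the natural DOM focus order and is a common cause of the mismatches described above. The sketch below flags those values, but it only surfaces candidates; confirming the real tab order still means pressing Tab yourself:

```python
from html.parser import HTMLParser


class TabindexScanner(HTMLParser):
    """Flags elements with a positive tabindex, which overrides the
    natural DOM focus order and often breaks keyboard navigation."""

    def __init__(self):
        super().__init__()
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        if value is None:
            return
        try:
            # tabindex="0" and tabindex="-1" are normal usage; only
            # explicit positive values reorder focus.
            if int(value) > 0:
                self.flagged.append((tag, int(value)))
        except ValueError:
            pass  # non-numeric tabindex; ignore here


def positive_tabindexes(html: str) -> list:
    scanner = TabindexScanner()
    scanner.feed(html)
    return scanner.flagged
```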
While automated tools can uncover some errors related to images and videos, manual testing is required to further assess criteria such as:
- Color contrast. While automated tools can evaluate the color contrast of many elements, manual testing is typically needed to assess text contrast against any background images.
- Image alt text. Automated tools can tell you whether or not alt text is present, but it requires manual interpretation to determine if the alt text is implemented properly according to best practices.
- Video alternatives. Most videos should have an appropriate text alternative, such as captions or a transcript.
When it comes to web accessibility testing, content is often the area most open to interpretation and is almost exclusively up to manual evaluation. Here are some areas of content accessibility that automated testing tools can’t uncover:
- Content comprehension. Is the content easy to understand? Is a web page titled appropriately based on the content it contains?
- Link usage. Do links have descriptive text as opposed to “click here” or “learn more”? Are they pointing to the expected URL?
- Heading usage. Are headings on the page in the proper hierarchy? Do they accurately describe the content?
- Zoom. When zoomed in, is the content still readable? Can you still navigate through the website properly?
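Of these, only the structural half of the heading check lends itself to scripting. The sketch below flags outlines that skip levels, say an h2 followed directly by an h4; whether each heading accurately describes its content remains a human judgment:

```python
from html.parser import HTMLParser


class HeadingOutline(HTMLParser):
    """Records the level of every h1-h6 tag in document order."""

    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))


def skipped_levels(html: str) -> list:
    """Return (previous, current) pairs where the outline jumps more
    than one level deeper, e.g. an h2 followed directly by an h4."""
    outline = HeadingOutline()
    outline.feed(html)
    return [
        (a, b) for a, b in zip(outline.levels, outline.levels[1:]) if b - a > 1
    ]
```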
Forms are another area where manual web accessibility testing is needed to confirm any automated testing results. A manual check is necessary to confirm things like:
- Field labels are visible and descriptive
- Placeholder or help text is used properly
- Keyboard users can successfully complete the form
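The presence of explicit label associations is one piece of this that can be scripted as a first pass. The sketch below flags fields that no `<label for="...">` points at; it deliberately ignores wrapping labels and ARIA attributes, so its hits are candidates for manual review, and whether a label is actually descriptive is still up to you:

```python
from html.parser import HTMLParser


class FormScanner(HTMLParser):
    """Collects form-field ids and the ids referenced by <label for>."""

    def __init__(self):
        super().__init__()
        self.field_ids = []
        self.label_fors = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("input", "select", "textarea") and attrs.get("type") != "hidden":
            self.field_ids.append(attrs.get("id"))
        elif tag == "label" and "for" in attrs:
            self.label_fors.add(attrs["for"])


def unlabeled_fields(html: str) -> list:
    """Fields with no id, or an id no <label> references.
    Ignores implicit (wrapping) labels and aria-label, so treat
    results as prompts for manual review, not confirmed errors."""
    scanner = FormScanner()
    scanner.feed(html)
    return [
        i for i in scanner.field_ids
        if i is None or i not in scanner.label_fors
    ]
```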
In addition to keyboard usage, there are other assistive technologies that people with disabilities might be using to navigate your website. Manual web accessibility testing can help uncover some of the issues assistive devices might encounter.
For example, the assistive technology most often discussed in accessibility testing is the screen reader. Free screen readers are available that you can use in your testing process, such as VoiceOver (built into macOS) or NVDA (free for Windows).
Engaging actual users with disabilities is always the preferable way to understand how they will navigate your website. Still, getting familiar with these tools during manual testing offers valuable insight you won’t get from automated tools alone.
How do I improve accessibility testing for my website?
Now that you better understand the differences between automated accessibility testing tools and manual web accessibility testing, you might be wondering about how best to integrate these options into your web development process.
Here are some parting tips:
- Test often during the initial development of a website. Using both automated and manual testing to uncover issues before content goes live is the most timely and cost-efficient way to resolve accessibility errors.
- For live websites, make accessibility testing part of the process for publishing new content or launching new features.
- If you have a large website, it might be unrealistic to evaluate every page. In these cases, consider testing a few pages of each template type, top-level pages, or pages with special functionality as a priority.
- Make web accessibility testing everyone’s responsibility. From designers to developers to content editors—anyone who plays a role in your website should have at least a general understanding of web accessibility and how to double-check their work for accessibility issues.