Simple Accessibility Tooling

Simplicity: “Worse is Better”

My approach to software acceptance has a history: “worse is better”, a term from a 1989 essay by Richard P. Gabriel (one of the designers of Common Lisp). In essence, a simpler tool may be more readily adopted, and then improved upon, than a complex tool that must be accepted and learned in full out of the gate.

The influential essay argued that “worse” solutions can have better survival characteristics than more complete solutions, which may be harder for users to accept initially. The “worse” solution may in fact turn out better in the end, because it can be improved over time until it is nearly as complete as the more “ideal” solution it once trailed.

It is more important that people adopt and stick with a tool than that the tool be the most complete solution from the start.

My main points are:

  1. A simpler accessibility test and result is more likely to be accepted as valid.

  2. People either need to run the test themselves or watch you run it, as directly as possible. They need to push a button, or see a test result at a link they themselves visited. This promotes buy-in.

  3. It may be better to run simple or incomplete tests to increase buy-in to the issue. Overly complex or imposing results are regularly ignored, or turn people off, unless their job role strongly requires that they take notice.

Acceptance of a test is just the beginning. Once primed or warmed up to testing, the person or organization may be ready for more advanced test results.

Summary: Try to keep it simple

Thought needs to go into the product or service behind your website, rather than just into making accessibility tests pass. Sometimes this requires getting as simple as possible in terms of communication.

Sometimes a test or demonstration can dramatically increase interest in the topic for someone who is adjacent or close to the work but not directly assigned to it. This article is also about trying to get buy-in with tools, as simply as possible.

I suggest that giving the smallest and simplest test directly to a stakeholder or coworker to run themselves can be very effective at increasing motivation and focus on the topic. By contrast, handing over an overly complex or detailed test or report can feel like a missed opportunity.

Give a person a fish and you feed them for a day; teach a person to fish and you feed them for a lifetime

Ideally your coworker or teammate, whatever their role, will come away understanding something new about web accessibility; following some of this advice at least improves the chances.

A note on automation: not yet a substitute for many things

A lot of accessibility testing cannot currently be automated, or is only partially automated, and some of it may never be. For this reason it can require team and organizational buy-in to get things moving in a better direction, and to build understanding of the benefits and issues related to web accessibility.

Usability testing that includes people who are skilled at manual testing and ideally those with accessibility needs remains as relevant as ever: tools are not a replacement for these practices.

At the end of the day, getting people to care a little more is always key to getting movement.

Web Accessibility Tools: Manual/Automatic in different ways

“Manual” tools requiring front-end interaction with the browser come in a variety of forms, such as these examples discussed below (not an exhaustive list, just some examples chosen to illustrate their simplicity):

  • Web apps (like a contrast checker)
  • Browser extensions
  • Bookmarklets
  • DevTools panels

Automated tooling examples that don’t necessarily rely on direct input from a user include:

  • Command line tools to test websites - Pa11y (Pa11y does more than that, but it does that too)
  • E2E testing frameworks - axe-core integrated with Cypress or Playwright (a minimal sketch follows below)

(Again, just a few examples, for illustration).
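
As a rough illustration of that second category, here is a minimal sketch of an axe-core scan inside a Playwright test, using the @axe-core/playwright package. The URL is a placeholder; point it at your own site, and expect your project setup to differ:

```typescript
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('homepage has no detectable accessibility violations', async ({ page }) => {
  await page.goto('https://example.com'); // placeholder URL: use your own site

  // Run the axe-core ruleset against the fully rendered page.
  const results = await new AxeBuilder({ page }).analyze();

  // Fail the test if axe reports any violations at all.
  expect(results.violations).toEqual([]);
});
```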

There are a lot of good tools lists; here are three:

Tools that are not advised, or to be wary of:

Freemium Scanning Services: There are a variety of ‘free’ or freemium services where you enter a URL into a web app, it scans the website for issues, and it produces a report for you. As I will discuss, these kinds of services have their own issues; they are not the approach I recommend in many cases, and they are usually not a good fit for a first introduction.

Note that one generally good example of a scanner is WAVE, which seems like a very solid tool. However, as I will discuss, it is still sometimes not as ideal as a more focused and simplified form of test result.

Accessibility Overlay tools: These have a variety of issues and potential dangers. See the Overlay Fact Sheet.

What is important is that you, and whoever you get to participate in a test, mostly understand what is actually happening. You are less likely to get that from a freemium scan or an overlay-type tool. For this exercise, we are focusing on the manually operated browser tools: simple web apps, browser extensions and bookmarklets.

Some of the things I’ve seen

Problem: There can be a lot of what I call “drive-by scanning”. This is when someone runs a scan, often on a service providing “free” scans, and dumps the result on someone else saying there are issues. In this scenario you get all the burden but do not necessarily learn anything, and the solution proposed by the platform usually involves… paying for its services. This method causes friction, sometimes with legal inferences and claims made by the sources linked: it is not a good reception for learning.

Suggestion: Some work needs to be done by the tester or stakeholder; it can’t be deferred to a third party or a completely automated tool and then dropped on someone. There is also a related issue of people not understanding the tool well enough: rarely does anyone fully understand the tests a scanning service runs under the hood. For this reason, we want to avoid using tools like freemium scanner services as an entry point for introducing new knowledge in some contexts.

Problem: Complexity of report findings. There are a lot of acronyms and a lot of assumed knowledge that many practitioners forget is not commonly held. Even saying or reading WCAG… you want that to come a little later.

Suggestion: As little work as possible should be needed to run the test and receive readable results. It needs to be something they can clearly understand.

Problem: “Jankiness” of experience, real or perceived, in some tools: false positives, or new interfaces that must be understood before results can be interpreted.

Suggestion: Keep the bookmarklets for the dev-friendlier people, extensions for a wider audience, and wherever possible use the simplest web apps that require minimal user input.

Solutions in the space to increase attention

Many have emphasized demonstrations and thought-provoking experiences to increase buy-in. This includes asking all kinds of people to consider accessibility themselves, outside their usual sphere:

  • Use a keyboard to navigate the website: can you do it? This is important for many people of many abilities
  • Can you read it without your glasses, or something similar, to consider non-visual content presentation? Can you read it while it’s sunny in the garden outside? Worth a try
  • Can you zoom/scale on a phone? Browsers mostly all support this natively now (a quick console check for zoom-blocking follows below)
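
On that last point, one common way pages break pinch-zoom is a restrictive viewport meta tag. Here is a rough console check for that anti-pattern, a sketch rather than a full audit (automated tools such as axe-core have a dedicated meta-viewport rule for this):

```typescript
// Does the viewport meta tag block pinch-zoom? (a common mobile anti-pattern)
const viewport = document.querySelector<HTMLMetaElement>('meta[name="viewport"]');
const content = viewport?.content ?? '';
const blocksZoom =
  /user-scalable\s*=\s*(no|0)/i.test(content) ||           // zoom disabled outright
  /maximum-scale\s*=\s*1(\.0+)?\s*(,|$)/i.test(content);   // zoom capped at 100%
console.log(blocksZoom ? `Zoom appears blocked: ${content}` : 'Pinch-zoom looks allowed');
```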

We are not trying to emulate having an accessibility need, but to encourage thinking about it generally. This builds buy-in for giving the issue increased attention and support.

Where possible, show, don’t tell

There are grades of complexity from simple to more complex solutions. Ideally you can break down whatever it is you are seeking to communicate into a standalone, specific demonstrative example.

Web app (like a contrast checker) with a warm introduction: minimize the noise and simplify it right down, perhaps in the simplest way possible, by entering values into a URL or permalinking a test result. The web continues to provide some of the lowest barriers to entry there are.

Contrast.report and the WebAIM Contrast Checker both earn my highest grade of respect for their simple design and power. There are other web app tests like them, but these are first in their class. WAVE performs a similar test across a whole page; for our purposes we specifically want the smaller, more specific test app, such as the permalink approach described below.

If you find a contrast error, even in an error report from a more advanced app, enter the two color values into Contrast.report or the WebAIM contrast checker. Confirm that the colors cause the contrast error with this tool. Then permalink the failing combination and share it with a warm introduction to the color contrast issue.
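
For the curious, the math these checkers run is small enough to sketch. Below is a minimal TypeScript version of the WCAG 2.x contrast-ratio formula, plus a hypothetical permalink in the query-string style the WebAIM checker uses (the parameter names are my assumption; confirm against the live tool):

```typescript
// WCAG 2.x relative luminance for an sRGB hex color like "777777".
function luminance(hex: string): number {
  const channel = (i: number) => {
    const c = parseInt(hex.slice(i, i + 2), 16) / 255;
    // Linearize the gamma-encoded sRGB channel (WCAG 2.x formula).
    return c <= 0.03928 ? c / 12.92 : ((c + 0.055) / 1.055) ** 2.4;
  };
  return 0.2126 * channel(0) + 0.7152 * channel(2) + 0.0722 * channel(4);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), ranging from 1:1 to 21:1.
function contrastRatio(fg: string, bg: string): number {
  const [lighter, darker] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

const ratio = contrastRatio('777777', 'FFFFFF');
console.log(ratio.toFixed(2)); // ~4.48, which fails WCAG AA's 4.5:1 for normal text

// A hypothetical permalink in the style of the WebAIM checker
// (parameter names assumed; verify against the live tool):
console.log('https://webaim.org/resources/contrastchecker/?fcolor=777777&bcolor=FFFFFF');
```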

In effect, what you are doing is removing all the noise of the test and focusing in on the concrete value being discussed, so that there is less that is new and less potential cause of confusion or questions. If sighted, they can see the text on the page and maybe think “yeah, this does look fuzzy”, even if they did not necessarily notice the lack of contrast at first. When this works as I imagine it can, it sometimes makes a strong impression that leads to increased focus on the topic.

Always provide a possible alternative where you can, or be ready to find one, even if it is just to say “we could adjust our color contrast ratio in updates”.

Browser Extensions: operable from the browser menu, installable from a store; some investment before return, but moderately simple. Your users have likely installed an extension before, which is a plus for simplicity. There are also trust signals available: you can see ratings in the store before installing, and so on.

  • Accessibility Insights - Microsoft-backed and well established; many people will be happy to install this.
  • HeadingsMap - Overall a reputable, well-used tool, apparently from an indie dev.

Extensions are understandable and live in the common working area; they are basically front-facing. I can’t speak to the full accessibility of these extensions themselves, but they are an option.

Extensions are somewhat transparent while still allowing some feeling of insight. Part of the good feeling of accepting a new tool is gaining insight beyond the obvious. There are still issues: will they retain it? What about permissions? It is still somewhat complex.

Bookmarklets: typically copied to the toolbar, and they may not run if external code is blocked. I’m not sure how many people are still familiar with these generally. Even if it feels on the surface like the simplest installation, it is a very dev-focused way of operating if you want to feel like you know what’s going on. A bookmarklet is technically simpler than an extension but not practically simpler, yet in some ways more accessible if extensions are a limitation.
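
To demystify the mechanism, here is a hypothetical, deliberately tiny bookmarklet body: it outlines images that are missing alt text. Real bookmarklets like the ones discussed below do far more; to turn this into an actual bookmarklet you would wrap it in an IIFE behind a javascript: prefix:

```typescript
// Highlight every <img> lacking an alt attribute (alt="" is fine for decorative images).
const missing = document.querySelectorAll<HTMLImageElement>('img:not([alt])');
missing.forEach((img) => {
  img.style.outline = '4px solid red'; // make each offender visible on the page
});
alert(`${missing.length} image(s) are missing an alt attribute`);
```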

Sa11y is an outstanding example of a powerful bookmarklet, perfect for an interested person, particularly a dev.

Most bookmarklets are optimized for use by experts, embedded with terminology or functionality specific to accessibility testing. See ANDI as an example.

Anything in a DevTools tab: these tools may be the best suited for report generation and detailed analysis, but anything involving the DevTools tab introduces complexity in installation and use.

  • Don’t forget how much of a barrier anything labelled a DevTool is.
  • Issues with Freemium/Upsells inside extensions.
  • Not always simple/sometimes false positives.

Show them once, hopefully make an impression

  1. Isolate and reproduce an issue on your own with a variety of tools. Find a good one to make a solid impression, and be very sure before you raise the idea. Immediately ask yourself: can I propose a solution to go along with it?

  2. Introduce the issue framed however you think appropriate, but emphasize that it’s easy to show. In rhetorical terms, you are not telling them something; you are merely showing something anyone can see. Mention that you reproduced it in several tests, but that this tool (ideally a web app, extension or bookmarklet) is a way to confirm it, with clear instructions:

Color contrast issue example: A warm introduction to a WebAIM permalink, plus a screenshot showing the two colors next to each other. Arguably the simplest example to show the importance of color contrast. The test result shows how the color combination appears, whether it passes or fails, and lets you adjust it to see live results. Simple stuff; I love it.

Headings issue: A warm introduction to HeadingsMap, the browser extension for Chrome/Firefox. Easy enough, but with a few more barriers; better for a slightly more mid-level user. If people can see a way to browse the headings on a page themselves, they are more likely to think about how someone using assistive tech might browse those headings, even if they were not previously aware of the importance of headings. (A console sketch of the same idea follows below.)
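
If installing an extension is a barrier in the moment, the underlying idea can be shown in seconds from the browser console. This is only a sketch of what a headings tool surfaces, not how HeadingsMap itself is implemented:

```typescript
// Print the page's heading outline to the console, indented by level.
document
  .querySelectorAll<HTMLHeadingElement>('h1, h2, h3, h4, h5, h6')
  .forEach((h) => {
    const level = Number(h.tagName[1]); // "H2" -> 2
    console.log(`${'  '.repeat(level - 1)}h${level}: ${h.textContent?.trim()}`);
  });
```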

Broad spectrum test with a bookmarklet: Sa11y is your friend, but keep in mind the dev focus of bookmarklets. What seems easy to install (“just drag it into your bookmarks toolbar”) may feel magical to some people. However, once installed it is a really intuitive experience, and it clearly explains what it is showing. Notably it marks things as “good” when they are considered good: simple language.

  3. Be prepared to confirm and show follow-ups if they ask for more information. Perhaps have a more advanced test result ready, but hold it back until it is a good time. Don’t overplay your information just yet.

  4. Always have a potential solution suggestion at hand, even if it is as simple as “adjust the color contrast” or “adjust the headings”.

Conclusion

If you keep your tests and results simple, I suggest it is possible to gain new ground in increasing focus on accessibility, so that you, and your organization or team, can do more than merely pass tests. Good luck! Happy testing, and happy showing people your results.

Deep thanks to everyone who reviewed this article.