Implementing a Minimum Accessibility Test Plan

January 11, 2021


Accessibility testing is similar to functional testing in that you test defined expected behavior against the software under test. However, automated accessibility testing comes with some unique challenges.

For one, website accessibility has not been a focus for many companies until recently. That means you are often testing a product that was already built against a different set of requirements and expected behaviors. At minimum, that means rework for the developers.

I don't advocate surprising developers with "gotcha" test cases, the ones you bring out to find those really awesome defects at the last minute. That leads to an adversarial relationship between developers and test engineers. Agreeing on the requirements, and on what and how you will test, before developers get started leads to more collaboration and less rework.

As of this writing there are 90 rules in aXe 4.1, and testing against the full list isn't efficient. You'd flood your backlog with accessibility defects and probably stress out your developers and analysts.

The solution I'd propose is to create a minimum accessibility test plan, get agreement that this is the minimum expected behavior of the software under test, and then automate those test cases to ensure the accessibility issues don't come back.

Creating a Minimum Accessibility Test Plan

A minimum accessibility test plan will establish your foundation for making your website more accessible to users. You should have an end goal established such as eventually reaching WCAG 2.1 AA compliance. The minimum accessibility test plan will be your first milestone along the way.

I'm not a huge fan of formal test plans and strategies in an agile SDLC (lengthy formal documentation takes too long), but I believe your test plan should answer:

  • Who the user(s) is/are (e.g., someone reliant on a screen-reader)
  • How they would use the system
  • Scope of testing (entire site? or certain logical flows?)
  • How will the tests be run? (automated, manual, both)
  • How often will the tests be run?
  • Who will run them? (pipeline, STE, developer?)

Once you have the test plan you can iterate on it with the product team and developers to ensure common understanding and gain agreement. This is a healthier collaborative approach to testing that should lead to less rework.

I'd recommend starting with the most impactful rules and guidance from the axe rule set, keeping a goal in mind such as improving the navigation experience for users who rely on screen readers.

Example requirements for a minimum accessibility test plan

  • Each page should have a relevant title (axe rule document-title)
  • Each page should have section headers relevant to the content (axe rules such as heading-order and empty-heading)
  • Verify text has proper color contrast (axe rule color-contrast)
  • Verify the site is navigable by keyboard alone (largely a manual check; axe rules such as tabindex cover parts of it)
  • Verify images have descriptive alternative text (axe rule image-alt)
  • Ensure forms can be filled in by keyboard alone
  • Ensure forms are not confusing when navigated with screen reading software
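
One way to keep a plan like this and the automation in sync is to capture the agreed-upon rule IDs in a small shared module that your test suites import. Here is a minimal sketch; the file name and groupings are my own, not part of Nightwatch or axe:

// a11y-plan.js - hypothetical module capturing the minimum accessibility test plan.
// The rule IDs come from the axe-core rule descriptions; the groupings are ours.
module.exports = {
  pageTitle: ['document-title'],
  headings: ['empty-heading', 'heading-order', 'page-has-heading-one', 'p-as-heading'],
  contrast: ['color-contrast'],
  images: ['image-alt']
};

A test can then pass one of these groups straight into the runOnly option shown later in this article, so the plan document and the assertions can't drift apart.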

Once you've established an agreed-upon minimum accessibility test plan, you can turn its requirements into automated accessibility test cases.

Automating your Accessibility Test Plan

Automating your accessibility tests is important because many accessibility violations are not as obvious as functional defects. Many accessibility features live in markup that isn't visible on the rendered page, which is where automation can save you a lot of time. In addition, if you run your automation regularly, your tests will ensure regression defects against your established accessibility requirements don't sneak in later.

My favorite tool for implementing automated accessibility testing is Nightwatch.js combined with the nightwatch-axe-verbose NPM package. I cover installing them in my article on accessibility testing with Nightwatch.js.
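
If you haven't set them up yet, both are installed from npm (npm install nightwatch nightwatch-axe-verbose --save-dev). Below is a minimal config sketch, assuming a Nightwatch 2+ project where nightwatch-axe-verbose registers itself through the plugins array:

// nightwatch.conf.js (excerpt) - assumes Nightwatch 2+ plugin support
module.exports = {
  src_folders: ['tests'],
  // registers the axeInject and axeRun commands used in the tests below
  plugins: ['nightwatch-axe-verbose']
};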

The nightwatch-axe-verbose framework, unlike some other accessibility assertion libraries, does not halt execution on the first rule failure. It reports all violations of the rules you specify within the context of the page you specify. This is great because it gives you a full picture of what needs to be remediated.

Further, the combination of Nightwatch.js and the nightwatch-axe-verbose package lets you run a subset of the 90 or so axe rules in your automated tests, so the tests can conform to your minimum accessibility test plan requirements.

You can run a subset of accessibility assertions in two different ways:

  • Disable specific tests from the axe ruleset
  • Run only specific rules from the ruleset

The latter aligns more closely with the minimum accessibility test plan approach, since the plan names the specific rules to run. Both examples are shown below.

Passing rules to exclude or disable

'Run everything except contrast and region': function (browser) {
  browser
    .url('@homePage')
    .axeInject()
    .axeRun('body', {
      rules: {
        'color-contrast': {
          enabled: false
        },
        'region': {
          enabled: false
        }
      }
    })
    .end();
}

Passing specific rules to include

'Run these rules only': function (browser) {
  browser
    .url('@homePage')
    .axeInject()
    .axeRun('body', {
      runOnly: ['color-contrast', 'image-alt']
    })
    .end();
}

The full list of rule names can be found in the axe-core rule descriptions. To learn more about configuring your test project to run Nightwatch.js with nightwatch-axe-verbose, watch this video on Nightwatch.js accessibility testing.
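
If you'd rather pull the rule names programmatically than read them from the docs, axe-core exposes its rule metadata through axe.getRules(). Since axeInject loads axe-core into the page, you can query it from a test. Here's a rough sketch that only logs the IDs; the URL is a placeholder:

'List available axe rule IDs': function (browser) {
  browser
    .url('https://example.com')  // placeholder; any loaded page works
    .axeInject()
    .execute(function () {
      // axe.getRules() returns [{ruleId, description, ...}] from the injected axe-core
      return axe.getRules().map(function (rule) { return rule.ruleId; });
    }, [], function (result) {
      console.log(result.value);
    })
    .end();
}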

Test suite style considerations

Good test suites should be readable, their intention clear, and easy to maintain. To keep a test case's intention clear, I typically try to test one thing per test where it makes sense, balancing that against efficiency. With accessibility tests using the axe framework you can cover the entire page, subsections, or specific HTML elements in just one test.

In addition, as shown above, you can run one or many rules against that element. In the examples above the assertions were run against the body element of the page, so all the axe rules cascade down to every HTML element inside body, effectively running one test against the entire page.

This provides you with a lot of assertion coverage using fewer tests.

nightwatch-axe-verbose has good reporting, so if you do go with that style it tells you how many elements on the page passed.

It breaks out failures individually with the element identifier, so you get a complete picture of each rule violation per element, which is useful for remediation.

√ Passed [ok]: aXe rule: region (296 elements checked)
× Failed [fail]: (aXe rule: region - All page content must be contained by landmarks
In element: h1)
...

Still, this can be a lot of information, so it may make sense to write your automated minimum accessibility tests in a style like the one below, where the test names make clear which rules are being checked and the tests run against more specific sections or elements of the page.

module.exports = {
  beforeEach: function (browser) {
    browser.page.login().navigate();
  },
  'Login page has descriptive title': function (browser) {
    browser
      .assert.title('Please login to BizCorp');
  },
  'Logo has alt text': function (browser) {
    browser
      .axeInject()
      .axeRun('#mainLogo', {
        runOnly: ['image-alt']
      })
      .end();
  },
  'Login page has accessible headers': function (browser) {
    browser
      .axeInject()
      .axeRun('body', {
        runOnly: ['empty-heading', 'heading-order', 'page-has-heading-one',
          'p-as-heading']
      })
      .end();
  }
}

There probably isn't a one-size-fits-all approach to this. Weigh your current time constraints for implementing the automation against the future cost of maintenance, and consider who will be running the tests after you, to find the right balance.

Don't forget state changes

Most likely you have pages with JavaScript that updates the look and feel of the page based on user interaction. You will want to make sure you test those scenarios too.

For example, if you submit a form with invalid or missing information, does a validation callout appear on screen? That page state should be tested as well.

Is the error marked up so that it makes sense to someone using a screen reader? Would someone know which field is in error from the message shown? An error stating "this field is required" without visual context may be confusing.
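
As a sketch of what that could look like, the test below submits an empty form and then runs a few rules against the error region once it appears. The URL, selectors, and rule choices here are illustrative assumptions, not part of the tutorial project:

'Validation errors are accessible': function (browser) {
  browser
    .url('https://example.com/login')               // hypothetical page
    .click('button[type=submit]')                   // submit with fields left empty
    .waitForElementVisible('.validation-callout')   // hypothetical error region
    .axeInject()
    .axeRun('.validation-callout', {
      runOnly: ['color-contrast', 'label', 'aria-valid-attr-value']
    })
    .end();
}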

Workflow considerations - Do I file a defect?

So what do you do if you are testing a legacy product, one without prior accessibility requirements, and find accessibility defects? I'd argue that if there was no accessibility requirement when the product was developed, it should be treated as a new feature request and not a bug.

For legacy products, it makes more sense to me to play stories that enhance the accessibility of the product as features instead of just burying the backlog with defects.

This allows you and your team to work the story like any other piece of functionality: the requirements come from the newly agreed-upon accessibility test plan, the changes are coded, and the work is verified by the tester.

However, it should be considered a defect if the automated accessibility tests catch a regression after the requirements are in place.

Any new feature built after the minimum accessibility plan is in place should treat it as an assumed requirement. Perhaps consider attaching the a11y requirements to stories as a reminder.

A11y violations on these new features not caught in development should be treated as defects.

Final Thoughts

Creating a minimum accessibility test plan is an important first step in making your website a11y compliant. Having a visible document your team can collaborate on will help with buy-in. Many of the most impactful changes are easy to implement and make your site's design better for all users.

Use automated tests to efficiently scan your site for violations of your accessibility test plan and to prevent backsliding through regression defects.

Finally, find (or be) the accessibility advocate for your company and show how making your site more accessible leads to higher usage and protects the company from the perils of non-compliance.

I've posted a working example of what I described in this article on my GitHub 👉 Implementing a minimum accessibility test plan using Nightwatch, aXe, and nightwatch-axe-verbose tutorial

To stay in touch, subscribe to the reallyMello channel on YouTube and check out my social links below. Thanks!


David Mello

Breaker of software, geek, meme enthusiast.

 
