Technology | Jill Power | April 24, 2018
Jill Power, EBSCO Senior Product Manager, provides accessibility testing tips for librarians looking to meet website compliance and better serve their end users.
With all the advances in technology, I frequently get asked if EBSCO simply tests our software code to assess accessibility and website compliance. While the software for accessibility testing has come a long way, there is still no silver bullet that will enable you to provide a completely accessible experience for users. We can test the code for many of the accessibility tags and the page structure that is required, and there are a number of excellent testing solutions available today that weren’t available five years ago, but why isn’t that enough?
It all comes down to usability, and website compliance depends heavily on usability. So, what is the best way to test your website or online resource to ensure that it is accessible for your users? A combination of automated tools and manual testing with assistive technology is your best bet. Below are a few tips to get you started.
Most tools for testing a live webpage for accessibility come in the form of a browser extension. The most useful browser extension for someone with little coding experience is WAVE, created by WebAIM (Web Accessibility in Mind). Once installed, with a click of a button on your toolbar, WAVE assesses your webpage, visually flagging all elements related to accessibility. WAVE calls out not only the errors on the page (those that fail compliance) but also uses warning flags to highlight potential problem areas. It visibly displays all the navigation headings on the page, as well as other structural elements.

Another tool to consider, as users get more comfortable with the code behind accessibility, is HTML CodeSniffer. This tool evaluates the page in a similar manner to WAVE; however, it doesn't provide as much of a visible indicator. Instead, it creates an interactive report that identifies each issue, the guideline it fails, and links to W3C guidance.

If you're feeling adventurous, another browser extension is aXe by Deque Systems. This tool brings the accessibility assessment into the browser's developer tools (F12), allowing testers to view the HTML, analyze the page and identify compliance issues in the code.
These tools flag accessibility compliance issues such as heading structure and order, presence of alt-text, presence of a page title and language attribute, color contrast, and proper semantic HTML structure.
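To make the idea concrete, here is a minimal sketch (not the actual code behind WAVE, HTML CodeSniffer or aXe, just an illustration) of two of the checks listed above: flagging images without alt text and a missing language attribute on the html element, using only Python's standard library. The sample page markup is hypothetical.

```python
from html.parser import HTMLParser

class BasicAccessibilityChecker(HTMLParser):
    """Illustrative sketch of two automated checks: <img> tags that
    lack an alt attribute, and an <html> element without lang."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "html" and "lang" not in attrs:
            self.issues.append("html element missing lang attribute")
        if tag == "img" and "alt" not in attrs:
            self.issues.append(f"img missing alt text: {attrs.get('src', '?')}")

# Hypothetical page fragment: one compliant image, one non-compliant.
page = """<html><body>
<img src="logo.png" alt="Company logo">
<img src="chart.png">
</body></html>"""

checker = BasicAccessibilityChecker()
checker.feed(page)
for issue in checker.issues:
    print(issue)
```

Real tools run dozens of such rules against the live, rendered page, which is why a browser extension is far more practical than a hand-rolled script; this sketch only shows the shape of what they automate.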
Screenshot of WAVE tool with visible indicators and summary of issues.
Screenshot of HTML CodeSniffer Alert Box with counts for errors, warnings and notices.
Screenshot of aXe extension as part of the Web Developer Tools highlighting the violations found and source of code.
After completing tests with the automated browser extensions, the next step is to test with assistive technology. Hands-on testing is always the best way to truly assess the user experience. Screen readers are the testing tool that requires the most practice. A screen reader lets a user with a visual disability experience the page through an audio representation of it. When using these tools, testers should slow down the reading rate, listen to the page, and make deliberate movements and selections so as not to be overwhelmed.

The most common screen readers in use today, according to WebAIM's annual survey, are JAWS, NVDA and VoiceOver. JAWS, although probably the most complicated of the three, is still a leading screen reader, so it's important to review webpages with it. NVDA is a free desktop solution that provides a text viewer so testers can read along while hearing the screen reader. VoiceOver is Apple's native screen reader, built into both macOS and iOS.

Other useful tools include the browser zoom feature (Ctrl and +) on a PC to magnify the entire screen, and the Windows Magnifier, which has the option to view the screen through a lens. iOS has similar built-in features. These work much like for-fee products such as ZoomText.
Manual testing will help identify issues such as whether the heading structure matches the desired content order of the page, whether the alt-text accurately conveys the same information as the related image, and whether pop-ups, dialogs and other notifications are consistent and obvious regardless of how the user is experiencing the page (screen reader or magnification).
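Judging whether the heading structure matches the content is a human call, but a small script can flag likely candidates for that manual review, such as heading levels that skip (an h3 directly after an h1), which screen reader users notice immediately when navigating by headings. The sketch below is an illustration under that assumption, not part of any tool named in this article; the sample markup is hypothetical.

```python
import re

def skipped_headings(html: str) -> list[str]:
    """Flag places where a heading level jumps by more than one,
    e.g. <h1> followed directly by <h3>, for manual review."""
    levels = [int(m.group(1)) for m in re.finditer(r"<h([1-6])\b", html, re.I)]
    warnings = []
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            warnings.append(f"heading jumps from h{prev} to h{cur}")
    return warnings

# Hypothetical page fragment with one skipped level.
page = "<h1>Results</h1><h3>Details</h3><h2>Methods</h2>"
print(skipped_headings(page))  # flags the h1 -> h3 jump
```

A flagged jump is not automatically a failure; the point is to hand the tester a short list of spots to verify with a screen reader rather than auditing every heading by ear.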
Last, but just as important: make sure testers ask actual users to review webpages. Using assistive technologies for testing gives you a better understanding of how a user experiences the site, and will likely identify some areas for improvement. However, nothing truly replaces an end user who depends on assistive technology for their computer interactions. They are native users and will have different habits and expectations. It may not be feasible to run every update and feature by users directly, but you should schedule periodic reviews of the pages to make sure the main content and functions are available to them.
For more information and demonstrations on how testers can assess webpages, come to Content Creation to Content Delivery: Partnering to Improve E-book Accessibility at NASIG’s annual conference in June.
Jill Power has over 15 years of experience in both product management and library services. At EBSCO, she has made it her mission to understand the users, any assistive technologies they leverage, and the way they access information and conduct research. She is an advocate for accessibility at EBSCO and for ensuring a continuous approach to making all their products accessible to all users.