Manual Accessibility Testing Guide
Automated tools catch 30-40% of accessibility issues. Manual testing covers the rest. This guide walks you through screen reader testing, keyboard navigation, and cognitive evaluation — complementing CompliScan's automated scanning.
No signup required. Results in under 60 seconds.
Why Manual Accessibility Testing Is Essential
Automated accessibility testing with tools like CompliScan catches 30-40% of WCAG 2.1 issues — the objective, machine-verifiable criteria like missing alt text, contrast failures, and broken ARIA attributes. The remaining 60-70% requires human judgment: Is the alt text actually meaningful? Is the reading order logical? Can a screen reader user understand the page flow? Is the language clear enough for people with cognitive disabilities?
With ADA lawsuits exceeding 10,000 per year and the April 24, 2026 Title II deadline, organizations cannot rely solely on automated scanning. Plaintiffs test websites with actual assistive technology — screen readers, switch devices, and voice control — and find issues that automated tools miss. Manual testing is what separates "we ran a scan" from "we actually tested with real users."
Screen Reader Testing
Screen readers convert visual content to speech or Braille output, enabling blind and low-vision users to navigate the web. Testing with a screen reader reveals issues invisible to automated tools:
- NVDA (Windows, free) — the most popular free screen reader; test your site's navigation, forms, and content reading order
- VoiceOver (macOS/iOS, built-in) — essential for testing Safari and mobile accessibility; activate with Cmd+F5 on Mac
- JAWS (Windows, paid) — the enterprise standard used by many blind professionals; important for B2B and government sites
Key screen reader tests to perform:
- Navigate the entire page using heading navigation (H key in NVDA/JAWS) — does the heading hierarchy make sense?
- Fill out every form using only the screen reader — are labels announced, errors described, and submit confirmations clear?
- Navigate all interactive components — do menus, modals, tabs, and accordions announce their state correctly?
- Listen to image alt text — does it convey the image's purpose, or is it generic ("image.jpg")?
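Before a screen reader walkthrough, a quick static pre-flight can catch the structural problems you'd otherwise discover by ear. The sketch below is a hypothetical helper (not part of CompliScan) built on Python's stdlib html.parser: it flags skipped heading levels, which break H-key navigation, and alt text that is missing or looks like a filename.

```python
from html.parser import HTMLParser

class HeadingAuditor(HTMLParser):
    """Pre-flight check before a screen reader walkthrough.

    Flags skipped heading levels (e.g. h1 -> h3) and images whose
    alt text is missing or appears to be a filename.
    """
    def __init__(self):
        super().__init__()
        self.heading_levels = []
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            level = int(tag[1])
            # A jump of more than one level breaks heading navigation.
            if self.heading_levels and level > self.heading_levels[-1] + 1:
                self.issues.append(
                    f"Skipped heading level: h{self.heading_levels[-1]} -> {tag}"
                )
            self.heading_levels.append(level)
        elif tag == "img":
            alt = dict(attrs).get("alt")
            if alt is None:
                self.issues.append("Image missing alt attribute")
            elif alt.lower().endswith((".jpg", ".png", ".gif")):
                self.issues.append(f"Alt text looks like a filename: {alt!r}")

page = """
<h1>Store</h1>
<h3>Shoes</h3>
<img src="shoe.jpg" alt="shoe.jpg">
<img src="logo.png" alt="Acme company logo">
"""
auditor = HeadingAuditor()
auditor.feed(page)
for issue in auditor.issues:
    print(issue)
```

A script like this only narrows the list; whether surviving alt text is actually meaningful still requires the human listening test described above.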
Keyboard Navigation Testing
Approximately 8 million people in the U.S. have motor disabilities that make using a mouse difficult or impossible. Keyboard testing ensures every interactive element is reachable and operable:
- Tab through the entire page — every link, button, form field, and interactive element should be reachable via Tab key in a logical order
- Check focus indicators — can you always see which element has keyboard focus? Focus should be clearly visible with a high-contrast outline
- Test all interactions — dropdown menus should open with Enter/Space, close with Escape; modals should trap focus and return it when closed
- Skip navigation — the first Tab press should reveal a "Skip to main content" link that bypasses the navigation
- No keyboard traps — you should never get stuck on an element that you cannot Tab away from
Keyboard testing is the single most impactful manual test you can perform. If your site works with a keyboard, it likely works with most assistive technologies. Run CompliScan first to catch the automated keyboard issues (missing tabindex, improper roles), then manually verify the full keyboard experience.
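The automated tabindex issues mentioned above can be illustrated with a short sketch. This is a hypothetical checker, again using stdlib html.parser, that flags the two most common tabindex anti-patterns: positive values, which override the natural DOM order, and focusable non-interactive elements without an interactive role (typically a div styled to look like a button).

```python
from html.parser import HTMLParser

FOCUSABLE_TAGS = {"a", "button", "input", "select", "textarea"}

class TabindexAuditor(HTMLParser):
    """Flags tabindex anti-patterns that disrupt keyboard navigation."""
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        tabindex = attrs.get("tabindex")
        if tabindex is None:
            return
        # Positive tabindex forces a custom focus order ahead of the DOM order.
        if tabindex.lstrip("-").isdigit() and int(tabindex) > 0:
            self.issues.append(f"<{tag}> uses positive tabindex={tabindex}")
        # Focusable element with no interactive semantics for screen readers.
        if tag not in FOCUSABLE_TAGS and attrs.get("role") not in ("button", "link"):
            self.issues.append(f"<{tag}> is focusable but has no interactive role")

page = """
<a href="/home" tabindex="3">Home</a>
<div tabindex="0">Looks clickable</div>
<button tabindex="0">OK</button>
"""
auditor = TabindexAuditor()
auditor.feed(page)
for issue in auditor.issues:
    print(issue)
```

Static checks like this cannot verify focus visibility, focus trapping in modals, or Escape-key behavior; those still require the hands-on-keyboard pass described in the list above.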
Combining Manual and Automated Testing
The most effective accessibility practice combines automated scanning (CompliScan) with structured manual testing:
- Start with CompliScan — run a free automated scan to identify machine-detectable violations and get AI-powered fix suggestions
- Implement automated fixes first — resolve the issues CompliScan identifies before starting manual testing; this eliminates noise and lets you focus on human-judgment issues
- Perform manual testing quarterly — screen reader testing, keyboard navigation, and cognitive review on key user flows (homepage, forms, checkout, account pages)
- Monitor continuously — CompliScan's paid plans ($49-299/month) run scheduled scans that catch automated-detectable regressions between manual test cycles
This workflow ensures continuous automated coverage supplemented by periodic human evaluation. Document both automated scan results and manual testing findings to build a compliance record that demonstrates good-faith effort — a key legal defense in ADA litigation.
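A compliance record is only useful if it is kept consistently. The sketch below shows one hypothetical way to structure such a log (the guide recommends documenting results, not any particular schema): each automated scan or manual test becomes a dated entry that can be exported as JSON for an audit or litigation response.

```python
import json
from dataclasses import dataclass, asdict
from datetime import date

@dataclass
class TestingRecord:
    """One entry in a good-faith accessibility compliance log.

    Hypothetical schema: dates, method, coverage, and outcomes,
    so the record shows ongoing effort over time.
    """
    test_date: str
    method: str          # "automated-scan" or "manual"
    pages_covered: list
    issues_found: int
    issues_fixed: int
    notes: str = ""

log = [
    TestingRecord(str(date(2026, 1, 15)), "automated-scan",
                  ["/", "/checkout"], issues_found=12, issues_fixed=12),
    TestingRecord(str(date(2026, 2, 1)), "manual",
                  ["/checkout"], issues_found=3, issues_fixed=2,
                  notes="Modal focus not returned on close; fix scheduled"),
]

# Serialize so the record can be produced on request during an audit.
record_json = json.dumps([asdict(r) for r in log], indent=2)
print(record_json)
```

Keeping both automated and manual entries in one log makes the quarterly cadence above visible at a glance, which is exactly what a good-faith-effort defense needs to show.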
Frequently Asked Questions
Can I skip manual testing if I use CompliScan?
No. CompliScan catches 30-40% of WCAG issues through automated testing. Manual testing covers the remaining 60-70% — including reading order, alt text quality, cognitive clarity, and complex interaction patterns. Both are necessary for genuine accessibility compliance.
Which screen reader should I test with?
Test with at least two: NVDA (free, Windows) and VoiceOver (built-in, macOS/iOS). If your audience includes enterprise or government users, add JAWS testing. Each screen reader interprets ARIA and HTML slightly differently, so multi-reader testing catches more issues.
How long does manual accessibility testing take?
A basic manual test of a key page (keyboard navigation + screen reader walkthrough) takes 30-60 minutes per page. A full site audit covering 10-20 key pages takes 2-4 days. Running CompliScan first and fixing automated issues reduces manual testing time by eliminating the easy-to-detect violations upfront.
Do I need to hire an accessibility expert for manual testing?
For basic manual testing (keyboard navigation, screen reader walkthroughs), your development or QA team can learn the skills with practice. For comprehensive VPAT documentation, legal compliance assessment, or usability testing with disabled users, an accessibility consultant adds significant value.
Check Your Website Now
Enter your URL below and get a free accessibility report with AI-powered fix suggestions. No signup required. Results in under 60 seconds.