Here is an uncomfortable truth: 94.8% of the top one million websites fail basic accessibility checks, according to the WebAIM Million 2025 report. That number has barely budged in five years despite growing awareness, new laws, and a flood of automated tools promising to fix everything.

So what is going wrong? Mostly, people skip the hard parts. They run an automated scan, fix a few color contrast issues, and call it a day. But automated tools only catch between 30% and 57% of WCAG violations. The rest requires a human being to sit down, unplug the mouse, fire up a screen reader, and actually try to use the website. With the European Accessibility Act enforced since June 2025, ADA Title II digital requirements hitting in April 2026, and over 5,100 accessibility lawsuits filed in 2025 alone, the stakes have never been higher.

This guide walks you through a complete accessibility audit process in eight concrete steps. No fluff, no theoretical frameworks you will never use. Just practical techniques that find real issues on real websites.
What Exactly Is a Web Accessibility Audit?
A web accessibility audit is a structured evaluation of a website against established accessibility standards, primarily the Web Content Accessibility Guidelines (WCAG). The goal is straightforward: identify every barrier that prevents people with disabilities from perceiving, understanding, navigating, and interacting with your site.
Think of it like a building inspection, but for your digital property. An inspector checks ramps, elevator buttons, door widths, and signage. An accessibility auditor checks keyboard navigation, screen reader compatibility, color contrast, form labels, and dozens of other criteria.
There are three types of accessibility audits, and understanding the difference matters because it shapes your expectations and budget.
Three Types of Audits
The first type is a purely automated audit. You run a tool like axe, WAVE, Lighthouse, or web-accessibility-checker.com across your pages and get a report of machine-detectable issues. This is fast and cheap, but it only catches 30% to 57% of WCAG violations depending on which study you reference. Missing alt text? Automated tools find it instantly. Poor alt text that says "image123.jpg" instead of describing the content? They have no idea.

The second type is a manual audit performed by accessibility specialists. They navigate with keyboards, test with screen readers, evaluate content quality, and check interactions that no algorithm can assess. This catches the other 43% to 70% of issues but requires trained people and significant time.

The third type, and the one you should aim for, is a hybrid audit combining both approaches. Start with automated scanning to catch the obvious violations at scale, then layer in manual testing for the nuanced issues. This is what we will walk through step by step. For a quick-reference version you can print out, see our accessibility audit checklist.

Step 1: Define Your Audit Scope
Before touching any tool, you need to answer three questions. What pages will you test? Which standard will you measure against? And how big is your sample?

For the standard, WCAG 2.1 Level AA is the baseline that virtually every accessibility law references. The European Accessibility Act mandates it. ADA lawsuits reference it. If you are just getting started, target WCAG 2.1 AA. If you want to go further, WCAG 2.2 added nine new success criteria in October 2023, including requirements for focus appearance, dragging movements, and consistent help placement. Our WCAG guide breaks down the differences in detail.

For page selection, do not try to audit every single page on a large site. Instead, select a representative sample that covers every unique template and every critical user journey. That typically means: the homepage, main navigation pages, at least one page of each content type (article, product, listing, form), the complete checkout or conversion flow, login and account pages, search results, error pages, and any page with complex interactive components like modals, carousels, or data tables.

For a site with under 50 pages, audit everything. For larger sites, a sample of 20 to 40 pages usually captures all unique patterns. The key insight is that fixing a template fixes every page using that template, so you get a cascade effect from focusing on templates rather than individual pages.

Step 2: Run Automated Scanning
Start your audit with automated tools because they give you immediate, quantifiable results across many pages. Think of this as triage: you are identifying the most obvious problems before diving deeper.

Run your site through web-accessibility-checker.com to get a quick baseline score. The free scan checks your page against WCAG criteria in under a second for DOM-based issues, with a deeper analysis following. You will get a breakdown by severity and WCAG principle that helps you understand the scale of work ahead.

Then cross-reference with at least one other tool. Each automated scanner uses slightly different rulesets and heuristics, so combining two tools catches more issues than either alone. Good free options include WAVE (excellent for visual overlay of issues), axe DevTools (strong browser extension with detailed explanations), and Lighthouse (built into Chrome, good for performance plus accessibility). Pa11y works well for command-line integration and CI/CD pipelines.

Document every finding in a spreadsheet or issue tracker. For each issue, record the page URL, the WCAG success criterion it violates, a screenshot or code snippet, the severity level, and which tool flagged it. This documentation becomes the backbone of your remediation plan.

One warning: do not stop here. I have seen too many organizations run an automated scan, fix everything it finds, and declare themselves accessible. That leaves at minimum 43% of issues completely unaddressed. For a deeper comparison of testing tools, see our guide to the best accessibility checker tools.

Step 3: Manual Keyboard Testing
Put your mouse in a drawer. Seriously. For this step, you navigate your entire sample using only the keyboard, and you will be surprised how quickly things fall apart on many websites.
Here are the keys you need: Tab moves forward through interactive elements. Shift plus Tab moves backward. Enter activates links and buttons. Space activates buttons, checkboxes, and toggles. Escape closes modals and dropdowns. Arrow keys navigate within components like menus, tabs, radio groups, and sliders.
Start on the homepage and press Tab repeatedly. Watch for these specific issues:
Focus visibility. Can you always see which element is currently focused? A visible focus indicator is required by WCAG 2.4.7 (Level AA). Many sites remove the default browser outline for aesthetic reasons and forget to replace it with something better. If you lose track of where you are on the page, that is a failure.
Tab order. Does the focus move through the page in a logical sequence? It should follow the visual reading order, generally left to right and top to bottom for LTR languages. If focus jumps erratically around the page, the underlying DOM order probably does not match the visual layout.
Keyboard traps. Can you always move away from the current element? Try tabbing into and out of every interactive component: menus, modals, date pickers, video players, embedded maps, carousels. If you get stuck and cannot Tab or Escape your way out, that is a keyboard trap and it is a Level A failure, the most serious kind.
Interactive functionality. Can you operate every feature? Open every dropdown, submit every form, play every video, dismiss every popup, complete every multi-step process. If any functionality is mouse-only, it is inaccessible.
Skip links. Does the site offer a skip to main content link that appears on Tab? Without it, keyboard users must Tab through the entire navigation on every single page.
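The "tab order" issue above follows a rule worth knowing: the HTML spec puts elements with a positive tabindex first (in ascending order), then everything else in document order, while a negative tabindex removes an element from the Tab sequence entirely. Here is a minimal, illustrative sketch of that ordering logic (the element names and the `(name, tabindex)` input shape are hypothetical, and real browsers layer additional rules on top, such as visibility and disabled states):

```python
def tab_order(elements):
    """Approximate keyboard Tab order per the HTML spec's sequential
    focus rules: positive tabindex values first (ascending, ties in
    document order), then tabindex 0 / default focusables in document
    order. Negative tabindex removes an element from the Tab sequence.

    `elements` is a list of (name, tabindex) pairs in DOM order.
    """
    positive = [(ti, i, name) for i, (name, ti) in enumerate(elements) if ti > 0]
    natural = [name for name, ti in elements if ti == 0]
    return [name for _, _, name in sorted(positive)] + natural

# A link given tabindex="2" jumps ahead of everything else on the page,
# which produces exactly the erratic focus jumps described above.
dom = [("logo-link", 0), ("search", 2), ("menu", 0), ("hidden-widget", -1)]
print(tab_order(dom))  # ['search', 'logo-link', 'menu']
```

This is also why the usual advice is to use only `tabindex="0"` and `tabindex="-1"` and let DOM order drive the sequence: any positive value fights the natural order.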
Step 4: Screen Reader Testing
Screen reader testing reveals a completely different dimension of your site. Content that looks perfectly organized visually can be an incoherent mess when read aloud sequentially.
According to the WebAIM Screen Reader User Survey #10, JAWS holds 41% market share on desktop, NVDA follows at 38%, and VoiceOver dominates mobile at 70.6%. You do not need to test with all of them, but you should test with at least one desktop reader and one mobile reader.
For practical purposes, start with NVDA on Windows (it is free) or VoiceOver on Mac (built in, activate with Command plus F5). On mobile, use VoiceOver on iOS or TalkBack on Android.
What to listen for during screen reader testing:
Page structure. When you navigate by headings (press H in NVDA or use the rotor in VoiceOver), do headings accurately reflect the content hierarchy? Are there skipped levels, like jumping from h2 to h4? Does every section have a meaningful heading?
Image descriptions. Navigate to every image and listen to the alt text. Is it descriptive and useful? An alt text of "photo" on a product image fails the test even though automated tools would mark it as having alt text present. Decorative images should have empty alt attributes so screen readers skip them entirely.
Form interactions. Tab through every form. Does each field announce its label? Are required fields indicated? When you submit with errors, are the error messages announced and associated with the correct fields? Can you navigate between error messages easily?
ARIA usage. This is where many sites create more problems than they solve. Pages with ARIA attributes actually had 34.2% more detected errors than pages without ARIA, per the WebAIM Million 2025. That is not because ARIA is bad but because it is frequently misused. Listen for incorrect role announcements, missing states, and widgets that announce one thing but do another.
Dynamic content. When content changes on the page, does the screen reader announce it? This includes form validation messages, loading states, notification banners, live chat widgets, and any content updated via JavaScript without a page reload. ARIA live regions handle this, but they need to be implemented correctly.
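The heading-hierarchy check described under "Page structure" is mechanical enough to sketch in code: walk the heading levels in document order and flag any spot where the level jumps down by more than one. This is an illustrative helper, not part of any particular tool:

```python
def heading_skips(levels):
    """Flag places where the heading hierarchy jumps down more than
    one level (e.g. h2 -> h4), which disorients screen reader users
    navigating by headings. `levels` is the page's heading levels
    in document order, e.g. [1, 2, 3, 2].
    """
    issues = []
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:  # deeper by more than one level = a skipped level
            issues.append(f"h{prev} -> h{cur} skips h{prev + 1}")
    return issues

print(heading_skips([1, 2, 4, 2, 3]))  # ['h2 -> h4 skips h3']
print(heading_skips([1, 2, 3, 2]))     # [] — moving back up is fine
```

Note that moving back *up* the hierarchy (h3 followed by h2) is normal and correct; only downward skips are a problem.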
Step 5: Check Color and Contrast
Color contrast failures are the single most common accessibility issue on the web, affecting 79.1% of the top one million homepages according to WebAIM. The good news is that contrast issues are straightforward to identify and fix.
WCAG defines three contrast thresholds. Normal text (smaller than 18pt, or smaller than 14pt if bold) needs a 4.5 to 1 contrast ratio against its background. Large text (at least 18pt, or at least 14pt bold) needs 3 to 1. User interface components and graphical objects that convey information also need 3 to 1.
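For the curious, the ratio behind those thresholds comes straight from the WCAG definition: compute each color's relative luminance from its sRGB channels, then take (L1 + 0.05) / (L2 + 0.05) with the lighter color on top. A direct transcription of that formula:

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color given as 0-255 ints."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white is the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# A trendy light gray (#999999) on white fails the 4.5:1 normal-text threshold.
print(contrast_ratio((153, 153, 153), (255, 255, 255)) >= 4.5)  # False
```

This is the same math the Colour Contrast Analyser and the automated scanners run for you; seeing it spelled out makes clear why a subtle gray-on-white scheme fails so often.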
Use the Colour Contrast Analyser (free, from TPGi) to check specific color combinations, or run the contrast checks built into your automated tools from Step 2. Pay particular attention to text over images or gradients, where contrast can vary across the element. Placeholder text in form fields is notorious for failing contrast requirements.
Beyond contrast ratios, check that your site does not rely solely on color to convey information. Error messages should not just turn red; they need an icon or text label too. Charts and graphs need patterns or labels in addition to color coding. Link text within paragraphs should have a non-color indicator like an underline, not just a different color.
Step 6: Review Content Quality
This step goes beyond technical compliance into content effectiveness. Automated tools flag the presence or absence of elements, but they cannot judge quality. You need human eyes and judgment here.

Alt text quality. Every informative image needs alt text that serves the same purpose as the image. A photo of a team meeting should describe relevant details, not just say "meeting." A graph should convey the key data point or trend it illustrates. Complex images like infographics may need a longer description via a linked text alternative.

Heading structure. Your heading hierarchy should read like a table of contents. Screen reader users frequently navigate by headings to scan page content, so your headings need to be descriptive and properly nested. No skipping levels. No using heading tags just to make text bigger.

Page language. Every page must declare its language in the HTML lang attribute. Content in a different language within the page (a French quote on an English page, for instance) needs a lang attribute on its container. Missing document language is one of the most common WCAG failures, showing up on a large percentage of pages in automated scans.

Form design. Every form input needs a visible, associated label. Group related fields with fieldset and legend. Provide clear instructions before the form, not just inside it. Error messages should identify the problem specifically ("Email address must include an @ symbol") rather than generically ("Invalid input"). Make sure error recovery is easy and does not clear previously entered valid data.

Data tables. Use proper table headers (th elements) with scope attributes. Avoid using tables for layout. Complex tables with merged cells need additional markup like headers attributes to remain understandable to assistive technology users. For broader context on compliance requirements, our ADA compliance guide covers the legal standards your content needs to meet.

Step 7: Test on Mobile Devices
Mobile accessibility testing catches issues that desktop testing misses entirely. Over 60% of web traffic comes from mobile devices, and people with disabilities are heavy mobile users because smartphones have excellent built-in accessibility features.
Touch target size. WCAG 2.2 requires interactive elements to be at least 24 by 24 CSS pixels for Level AA conformance, with a recommendation of 44 by 44 pixels for AAA. Measure your buttons, links, form fields, and any tappable element. Pay special attention to elements that sit close together, like navigation links in a footer or action buttons in a list. Small, crowded targets are a nightmare for users with motor impairments.
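The size thresholds above can be expressed as a simple classifier. A hedged sketch (the function name is ours, and note that WCAG 2.5.8 carries exceptions this ignores, such as targets with sufficient surrounding spacing or links inline in a sentence):

```python
def target_size_level(width_px, height_px):
    """Classify a tap target against WCAG 2.2 target-size thresholds:
    44x44 CSS px meets the AAA recommendation, 24x24 meets the
    Level AA minimum (2.5.8), anything smaller fails outright.
    Exceptions in 2.5.8 (spacing, inline links) are not modeled here.
    """
    smallest = min(width_px, height_px)  # the limiting dimension
    if smallest >= 44:
        return "AAA"
    if smallest >= 24:
        return "AA"
    return "fail"

print(target_size_level(48, 48))  # AAA
print(target_size_level(30, 24))  # AA
print(target_size_level(40, 16))  # fail — one dimension under 24px is enough
```

In a real audit you would feed this the rendered CSS-pixel dimensions from the browser's devtools or `getBoundingClientRect()`, not hand-typed values.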
Zoom and reflow. Zoom to 200% and then to 400%. Content should reflow into a single column without horizontal scrolling, without overlapping elements, and without losing functionality. Users with low vision frequently zoom: a site whose text breaks at 200% fails WCAG 1.4.4 (Resize Text), and one that forces horizontal scrolling at 400% fails WCAG 1.4.10 (Reflow).
Orientation. Your site should work in both portrait and landscape orientation unless a specific orientation is essential (which is extremely rare). Some users mount their devices in a fixed orientation due to physical limitations.
Touch gestures. If your site uses complex gestures like pinch, multi-finger swipe, or long press, provide single-pointer alternatives. A carousel that requires a swipe needs previous and next buttons too.
Test with VoiceOver on iOS and TalkBack on Android. The experience is notably different from desktop screen readers, and you may find issues specific to mobile assistive technology interaction patterns.
Step 8: Document and Report Your Findings
Everything you have found in steps 2 through 7 needs to be organized into an actionable report. This is not just documentation for its own sake; a well-structured report determines whether issues actually get fixed or sit in a backlog forever.

A strong audit report contains six sections. Start with an executive summary that communicates overall conformance level and risk to non-technical stakeholders. Include a numerical score or grade and a clear statement of what it means. Decision-makers need to understand the situation in under two minutes.

Next, document your scope and methodology: which pages were tested, which tools and techniques were used, which WCAG version and conformance level you measured against, and any limitations of the audit.

Then organize findings by WCAG principle. Group issues under Perceivable, Operable, Understandable, and Robust. Within each principle, list the specific success criteria that failed. This structure aligns with how WCAG is organized and makes it easier for developers to find relevant guidance.

For each individual issue, include the WCAG success criterion, the severity level (critical, serious, moderate, or minor), the affected pages or components, a clear description of the problem, a screenshot or code snippet demonstrating the issue, and a specific remediation recommendation with code examples where possible.

After the findings, provide a remediation roadmap with prioritized phases. More on prioritization in the next section. Finally, include a retest plan that specifies when and how you will verify that fixes were implemented correctly. An audit without a retest cycle is only half the job. If you need to publish an accessibility statement after your audit, our accessibility statement template gives you a ready-to-use format.

How to Prioritize What to Fix First
You have your audit report. It probably lists dozens, maybe hundreds of issues. Where do you start?
First, classify every issue by severity. Critical issues block access entirely: a keyboard trap that prevents checkout, a form with no labels at all, a video with no captions. Serious issues make tasks very difficult but not impossible. Moderate issues cause frustration. Minor issues are technically non-conformant but have limited real-world impact.
Then apply a prioritization formula: scope multiplied by impact, divided by effort. Scope asks how many pages or users are affected. Impact asks how severely the issue affects task completion. Effort asks how much work the fix requires, and dividing by it means cheaper fixes rank higher. High scope, high impact, low effort issues go to the top of the list.
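One way to encode that formula, treating effort as a divisor so that low-effort fixes rank higher. The 1-to-5 scales and the example issue names are illustrative, not from any standard:

```python
def priority_score(scope, impact, effort):
    """Hypothetical 1-5 scales: scope = pages/users affected,
    impact = how severely the issue blocks task completion,
    effort = work required to fix. Dividing by effort pushes
    cheap, high-leverage fixes to the top of the queue.
    """
    return (scope * impact) / effort

issues = {
    "missing skip link in header template": priority_score(5, 4, 1),   # 20.0
    "low-contrast footer links": priority_score(3, 2, 1),              # 6.0
    "keyboard trap in legacy date picker": priority_score(2, 5, 4),    # 2.5
}
ranked = sorted(issues, key=issues.get, reverse=True)
print(ranked[0])  # missing skip link in header template
```

Note how the template-level skip link wins despite the keyboard trap being more severe per user: it touches every page and costs almost nothing to fix, which is the cascade effect described below. Severity still matters, which is why critical Level A failures may deserve a manual bump regardless of score.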
In practice, here is the order that works best. Fix template-level issues first because they cascade across every page using that template. A missing skip link in your header template? Fix it once, and every page is fixed. That is enormous leverage.
Next, fix critical issues on high-traffic and high-conversion pages. Your homepage, product pages, checkout flow, and contact forms deserve priority because they affect the most users and the most revenue.
Then work through serious and moderate issues systematically. Group them by component: fix all heading issues at once, then all form issues, then all image issues. Batching by type is more efficient than fixing page by page.
Save purely cosmetic or edge-case minor issues for a maintenance phase. They matter, but they should not delay the more impactful work.
Best Tools for Accessibility Auditing
Having tested dozens of tools across hundreds of audits, here is what actually works in practice.

For free tools, WAVE is excellent for visual learners because it overlays icons directly on your page showing where issues occur. axe DevTools is the gold standard browser extension for developers, with clear explanations and code-level detail. Lighthouse is convenient because it is built into Chrome and combines accessibility with performance and SEO checks. Pa11y is powerful for CI/CD integration, letting you automate accessibility checks in your deployment pipeline. The Colour Contrast Analyser from TPGi handles contrast checking with an eyedropper tool that works across any application.

Web-accessibility-checker.com offers a free scan that checks your page in under a second for DOM issues and follows up with a deeper PageSpeed Insights analysis. The paid monitoring plans add scheduled scanning, historical tracking, and multi-page coverage.

For paid enterprise tools, Siteimprove is comprehensive but expensive at around $28,000 per year. AudioEye starts at $45 per month for automated monitoring. axe Monitor from Deque extends the free axe engine with dashboard reporting and team features. Level Access provides full-service auditing with expert consultants.

The honest truth is that no single tool is sufficient. Combine a free scanner for quick checks, a browser extension for development, and manual testing for completeness. The best tool is the one your team actually uses consistently.

What Automated Tools Cannot Catch
Understanding the limits of automation makes you a better auditor. Here are the categories that consistently require human judgment.
Alt text quality is the classic example. Automated tools verify that alt attributes exist but cannot evaluate whether they are meaningful. An image with alt text set to "DSC_0042.jpg" passes automated checks but fails real users completely.
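You can automate a narrow slice of this with heuristics that catch the most blatant offenders, filenames and vacuous single words, but that is the ceiling. A sketch of such a heuristic (the function name and word list are ours; judging whether genuinely descriptive alt text fits its context still requires a person):

```python
import re

def suspicious_alt(alt):
    """Heuristic red flags for alt text that passes automated checks
    but fails real users: camera-default filenames or vacuous words.
    A sketch only; anything this does NOT flag still needs human review.
    """
    text = alt.strip().lower()
    vacuous = {"image", "photo", "picture", "graphic", "icon", "img"}
    if text in vacuous:
        return True
    # Matches filename-style alt like "DSC_0042.jpg" or "image123.png".
    if re.fullmatch(r"(dsc|img|image)?[_\-]?\d+\.(jpe?g|png|gif|webp|svg)", text):
        return True
    return False

print(suspicious_alt("DSC_0042.jpg"))  # True
print(suspicious_alt("photo"))         # True
print(suspicious_alt("Team reviewing audit findings at a whiteboard"))  # False
```

The asymmetry is the point: a `True` here is almost certainly a real problem, but a `False` proves nothing, which is exactly the gap between automated checks and human judgment.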
Keyboard trap detection is partially automatable but unreliable. Automated tools can check basic tab navigation, but complex widgets with conditional focus management often need a real person pressing keys to discover traps.
Reading order versus visual order. CSS Grid and Flexbox make it easy to create layouts where the visual order differs from the DOM order. Automated tools check the DOM; only a human using a screen reader or Tab key notices when things are read out in the wrong sequence.
Cognitive accessibility is almost entirely a human judgment call. Is the language too complex? Are instructions clear? Is the navigation predictable? Are error messages helpful? These questions require understanding context and user expectations.
ARIA correctness beyond syntax. Automated tools can verify that an ARIA attribute has a valid value, but they cannot tell you whether role="button" on a div that looks and acts like a link is semantically wrong. They cannot determine if an aria-label contradicts the visible text. The WebAIM Million found that pages using ARIA had 34.2% more errors than pages without it, largely because developers use ARIA incorrectly.
Real user experience. No tool can tell you whether your checkout flow is actually usable with a screen reader. That requires a human walking through the full process, encountering the real sequence of interactions, and evaluating whether the overall experience makes sense.
The Most Commonly Failed WCAG Criteria
Knowing the most common failures helps you focus your audit and calibrate your expectations. The WebAIM Million 2025 analyzed the homepages of the top one million websites and found these leading issues.

Low contrast text appeared on 79.1% of pages. This is the most prevalent accessibility failure on the web by a wide margin. Light gray text on white backgrounds, trendy low-contrast color schemes, and insufficient placeholder text contrast are the usual culprits.

Missing alternative text for images affected 55.5% of pages. This includes images with no alt attribute at all and images with empty alt on non-decorative content.

Missing form input labels affect a huge number of pages. When a form field has no programmatic label, screen reader users have to guess what information to enter. Placeholder text is not an acceptable substitute for a label because it disappears when you start typing and has poor contrast in most browsers.

Empty links and empty buttons are links or buttons that contain no accessible text. A link wrapping only an icon with no alt text or aria-label, or a button with only a background image, fails this criterion.

Missing document language is one of the easiest issues to fix. Adding lang="en" to your HTML tag takes five seconds, yet a substantial percentage of pages still omit it. Without it, screen readers may use the wrong pronunciation rules for the entire page.

All of these are Level A violations, the most basic level of accessibility. If your site fails at Level A, it has fundamental problems that need urgent attention. Our EAA compliance page explains how these criteria map to European legal requirements.

Cost, ROI, and Making the Business Case
Accessibility audits have clear costs and even clearer returns. Understanding the numbers helps you secure budget and justify the investment.
For costs, a professional audit of a small to medium website typically runs between $1,250 and $5,500 depending on scope and complexity. Enterprise-level audits with hundreds of templates and complex applications range from $25,000 to $40,000 or more. A DIY audit following the process in this guide costs mainly staff time.
Now consider the risk side. The average settlement for an ADA web accessibility lawsuit is around $25,000, and that does not include legal fees, remediation costs, or reputational damage. With over 5,100 lawsuits filed in 2025, this is not a theoretical risk.
The return on investment numbers are compelling. Forrester research found approximately $100 return for every $1 invested in accessibility improvements. Sites that improve accessibility see an average 23% increase in organic traffic because accessibility improvements align closely with SEO best practices: clean HTML, proper headings, descriptive link text, fast load times. Building accessibility in from the start saves 67% compared to retrofitting later.
Perhaps the most striking number: cart abandonment rates drop from approximately 69% on inaccessible e-commerce sites to about 23% on accessible ones. When people can actually use your site, they buy things. That alone can justify the entire cost of an audit and remediation program many times over.
The business case writes itself. An audit is not an expense. It is an investment that reduces legal risk, expands your market, improves SEO, and increases conversions. The only question is whether you do it proactively or wait for a lawsuit to force your hand.