Google Lighthouse is built into Chrome DevTools and runs with a single click. Its accessibility score — that green or yellow number between 0 and 100 — is familiar to every web developer. But what does that score actually mean for WCAG 2.1 compliance?
This post explains exactly what Lighthouse checks, what it misses, and when you need a dedicated accessibility scanner like Accessalyze alongside it.
Lighthouse checks approximately 30 accessibility rules. WCAG 2.1 AA has 50 success criteria. A perfect Lighthouse score leaves significant gaps — and those gaps are what ADA lawsuits are filed over.
Lighthouse is an open-source automated auditing tool built by Google. It's available in Chrome DevTools under the "Lighthouse" tab, as a CLI tool, and as part of PageSpeed Insights. It audits pages for performance, SEO, best practices, PWA — and accessibility.
The Lighthouse accessibility audit is powered by axe-core, an open-source accessibility rules engine developed by Deque Systems. Lighthouse exposes a curated subset of axe-core rules, weighted into a score from 0 to 100.
Lighthouse grades each page on a 0–100 scale. The score is a severity-weighted average of the rules it runs — a missing form label might count for more than a missing lang attribute on a nested frame. Passing every rule gives you 100.
Here's the problem: the score only reflects the ~30 rules Lighthouse runs. It tells you nothing about the WCAG criteria that aren't covered by those rules. A site with no WCAG 1.1.1 (alt text) issues, no color contrast failures, and no missing form labels can score 100/100 on Lighthouse while still having significant WCAG violations around time-based media, keyboard traps, or focus order.
| WCAG Category | Lighthouse | Accessalyze |
|---|---|---|
| Images: alt text (1.1.1) | ✓ | ✓ |
| Color contrast (1.4.3) | ✓ | ✓ |
| Form labels (1.3.1, 3.3.2) | ✓ | ✓ |
| Button names (4.1.2) | ✓ | ✓ |
| Link name / purpose (2.4.4) | ◑ Partial | ✓ |
| Language of page (3.1.1) | ✓ | ✓ |
| ARIA attribute validity (4.1.2) | ✓ | ✓ |
| Duplicate IDs (4.1.1) | ✓ | ✓ |
| Heading order (1.3.1, 2.4.6) | ◑ Basic only | ✓ |
| Keyboard focus visible (2.4.7) | ✗ | ✓ |
| Skip navigation links (2.4.1) | ✗ | ✓ |
| Time-based media captions (1.2.2) | ✗ | ◑ Flags media elements |
| Keyboard traps (2.1.2) | ✗ | ◑ Detects common patterns |
| Focus order (2.4.3) | ✗ | ✓ |
| Meaningful sequence (1.3.2) | ✗ | ✓ |
| Consistent navigation (3.2.3) | ✗ | ✓ |
| Error identification (3.3.1) | ✗ | ✓ |
| Status messages (4.1.3) | ✗ | ✓ |
✓ = Checked · ◑ = Partially covered · ✗ = Not checked. Neither tool replaces manual testing for all WCAG criteria.
WCAG 2.4.1 (Level A) requires a mechanism to skip repeated navigation blocks — typically a "Skip to main content" link. Lighthouse does not check for this. Keyboard-only users who navigate page-by-page are significantly impacted when skip links are missing. Accessalyze flags absent skip navigation.
When a user navigates with Tab, every focused element must have a visible focus indicator. Lighthouse does not reliably check whether your CSS has hidden the default browser focus ring with outline: none. Missing focus indicators are a common lawsuit trigger because they completely break keyboard navigation. Accessalyze detects CSS patterns that suppress focus visibility.
The order in which Tab moves focus through interactive elements must match the visual reading order. CSS positioning, flexbox reordering, and tabindex values can create a focus order that jumps around the page confusingly. Lighthouse doesn't check this; Accessalyze analyzes DOM order vs. visual order.
When a form submission fails, errors must be identified in text (not just color or icon). Lighthouse doesn't analyze error states — it can only see the initial page load. Accessalyze includes rules for ARIA live region usage and error announcement patterns that catch common 3.3.1 failures.
Navigation that appears on multiple pages must remain in the same relative order. This is inherently a multi-page check — Lighthouse only audits one page at a time and can't compare across pages. Accessalyze's multi-page scanning can flag inconsistencies across templates.
| Feature | Lighthouse | Accessalyze |
|---|---|---|
| Built into Chrome DevTools | ✓ | ✗ (URL-based scan) |
| Accessibility rule count | ~30 rules | Full automatable WCAG 2.1 AA ruleset |
| Accessibility score | ✓ 0–100 weighted score | ✓ Per-page issue count + severity |
| AI fix code | ✗ | ✓ Generated for each issue |
| Multi-page scanning | ✗ One page at a time | ✓ |
| CI/CD integration | ✓ Via CLI / GitHub Action | ✓ Via REST API / GitHub Action |
| Continuous monitoring | ✗ | ✓ |
| Shareable compliance reports | ◑ PageSpeed Insights link | ✓ Permanent report links |
| Performance + SEO audit | ✓ (Lighthouse's main strength) | ✗ Accessibility only |
| Cost | Free | Free scan; paid plans for API + monitoring |
If you're a developer using Chrome, Lighthouse is already there. Open DevTools, go to the Lighthouse tab, click Generate Report. No account, no API key, no CLI install. For a quick check during development, the friction is essentially zero.
Lighthouse gives you performance, SEO, best practices, and accessibility in one report. If you're already running Lighthouse for Core Web Vitals or SEO, checking accessibility at the same time adds no extra step. Accessalyze is accessibility-only — you'd still need Lighthouse for the rest.
The Lighthouse CLI is well-documented and widely used in CI pipelines. You can set score thresholds and fail builds automatically. Accessalyze also supports CI integration via its API, but if your team already has Lighthouse in CI, there's no reason to remove it — add Accessalyze for deeper accessibility coverage.
Because Lighthouse runs in your Chrome profile, it inherits your cookies and session state. Running it on a localhost dev server with a logged-in session works out of the box. Accessalyze requires explicit configuration for auth-gated pages.
Accessalyze checks the full WCAG 2.1 AA ruleset that can be automated — not just a curated subset. The additional coverage matters most for keyboard accessibility (focus indicators, skip links, focus order), ARIA patterns, and multi-page consistency checks.
For every violation found, Accessalyze generates the corrected code. A developer looking at a Lighthouse report has to go research the fix; a developer looking at an Accessalyze report gets the fix immediately. This saves significant remediation time at scale.
Lighthouse is a point-in-time snapshot. You have to remember to run it. Accessalyze can run on a schedule and alert you when a new accessibility regression appears after a deployment — without anyone manually triggering a test.
Lighthouse audits one page at a time. Accessalyze can crawl an entire site — scanning dozens or hundreds of pages in a single job and aggregating the results. For sites with many templates, a single page audit gives you a false sense of coverage.
Keep Lighthouse in CI for performance and SEO — its accessibility rules still catch real issues, and it's already in your workflow. An accessibility score below 90 is a red flag worth addressing immediately.
Add Accessalyze for deeper WCAG coverage — run it on your key pages to catch the issues Lighthouse misses (focus visibility, skip navigation, focus order, error handling patterns). Schedule nightly monitoring to catch regressions.
If you're managing ADA compliance risk — meaning you need to document that you've taken reasonable steps to meet WCAG 2.1 AA — Lighthouse alone is not sufficient. Accessalyze fills the gap and provides shareable compliance reports.
Here's a realistic scenario that illustrates the gap:
a page opens a modal with valid ARIA (role="dialog"), a proper heading, and labeled buttons — so Lighthouse passes it. But if keyboard focus never moves into the modal and a keyboard user can't reach or dismiss it, the page fails WCAG keyboard requirements that Lighthouse never tests.

No — a perfect Lighthouse score is not proof of ADA compliance. A Lighthouse accessibility score reflects a subset of WCAG criteria. ADA compliance for websites is generally interpreted to require WCAG 2.1 AA, which has 50 success criteria. Lighthouse checks roughly 30 rules that can be automated — it doesn't cover many keyboard accessibility, timing, and multi-page consistency requirements. A Lighthouse score of 100 does not constitute legal compliance documentation.
Lighthouse's accessibility audit is powered by axe-core, the same open-source engine used by many accessibility tools. Accessalyze also incorporates axe-core rules and extends them with additional WCAG criteria checks. Note that two tools built on axe-core can still check different subsets of the available rules.
The Lighthouse CLI supports CI integration, and there are GitHub Actions available for it. You can set thresholds and fail builds on accessibility score drops. Accessalyze also offers GitHub Action integration, with per-violation detail rather than just a score threshold.
A score of 90+ is a useful minimum threshold, but it's not a compliance target. Scores below 90 almost certainly indicate real WCAG Level A or AA issues. Scores of 90–100 may still have significant violations in areas Lighthouse doesn't cover. Treat 100/100 as a floor, not a ceiling, and use additional testing to catch what Lighthouse misses.
Accessalyze doesn't replace Lighthouse — they serve different purposes. Lighthouse provides performance, SEO, and PWA audits in addition to accessibility; Accessalyze only does accessibility. Keep using Lighthouse for what it does well, and add Accessalyze when you need deeper WCAG coverage, monitoring, fix code, or multi-page scanning.
Run a free WCAG 2.1 accessibility scan in under 60 seconds. No signup required. Get the issues Lighthouse doesn't check, with AI fix code for each one.