Lighthouse scores and AEO: the connection nobody makes
Lighthouse measures more than you think
I'm going to say something you might not expect from someone who has been building websites for over twenty years: Google Lighthouse is one of the most underrated tools out there. Not because it's a brilliant instrument, but because people use it for the wrong purpose.
They run a scan. They see green. They think: done.
But hold on. That's a bit like looking at the outside of a birdhouse and concluding it's fine, when really you should be checking whether any birds actually live in it.
What I've seen across dozens of Kobalt audits over the past two years honestly surprised me: sites with a Lighthouse total score above 90 consistently score higher on our AEO scan. Not slightly. Consistently. And that's no coincidence.
Lighthouse is an open-source tool from Google that measures Performance, Accessibility, Best Practices and SEO. You run it via Chrome DevTools, the CLI or PageSpeed Insights. Scores range from 0 to 100, with 90+ considered good. And yes, I once spent an entire Friday evening getting that one point from 89 to 90 for a client. My wife was less impressed than I was.
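If you'd rather script that PageSpeed Insights route than click through the UI, here's a minimal sketch that only builds the request URL for the public v5 API. The endpoint and parameter names are the real API's; heavy automated use may require an API key, so this deliberately stops short of the network call:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="DESKTOP"):
    """Build a PageSpeed Insights v5 request covering all four Lighthouse categories."""
    params = [("url", page_url), ("strategy", strategy)]
    # The API takes one 'category' parameter per category you want scored.
    for cat in ("PERFORMANCE", "ACCESSIBILITY", "BEST_PRACTICES", "SEO"):
        params.append(("category", cat))
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://example.com"))
```

Fetch that URL and you get the full Lighthouse JSON back, which is handy when you want to track scores over time instead of eyeballing them.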
Why Lighthouse and AEO correlate
Let me think out loud here for a moment. The correlation clearly exists, but why?
AI crawlers like GPTBot and ClaudeBot behave like an automated browser without a JavaScript engine. They read HTML, follow links, evaluate what they see. A slow, bloated page is just as frustrating for them as it is for a user on shaky train WiFi.
The four Lighthouse categories each touch a signal that AI models also value:
- Performance: fast TTFB and low LCP mean the page is quickly and fully available. Slow servers miss crawl windows. Simple.
- Accessibility: correct ARIA labels, alt texts and semantic HTML are directly usable as context signals. This is gold for AI.
- Best Practices: HTTPS, no mixed content, no deprecated APIs. Trust signals that AI source evaluators factor in.
- SEO: title tags, meta descriptions, canonical tags. The classic foundation that helps AI interpret pages correctly.
These are the same quality signals. Lighthouse measures them on one scale. AEO optimization targets them for a specific purpose. The overlap? Massive.
What the Kobalt data shows
In 2025, we conducted audits at Kobalt on 34 websites across various sectors. For each we measured the Lighthouse score (desktop, no throttling) and the AEO score via our scanner. The results were... well, they spoke for themselves.
- Lighthouse below 60: average AEO score of 31. That is downright poor.
- Lighthouse between 60 and 80: average AEO score of 52. Better, but still mediocre.
- Lighthouse between 80 and 90: average AEO score of 67. Now we're getting somewhere.
- Lighthouse above 90: average AEO score of 79. That's the difference between invisible and relevant.
Is that causal proof? No. But the direction is so clear you can't ignore it. A poor Lighthouse score is almost always a symptom of broader quality issues that also affect your AI visibility.
What you actually do with these insights
Enough theory. Here's what you can do tomorrow morning (or tonight, if you're anything like me):
- Run Lighthouse on your top 10 pages via PageSpeed Insights. Record the scores per category. Not just the homepage, also your key landing pages and blog posts.
- Fix Accessibility issues first. Missing alt texts, poor color contrast ratios, unlabeled form elements. This has direct value for AI and it's often a quick fix.
- Tackle Performance: eliminate render-blocking scripts, switch to WebP/AVIF for images, set a decent caching policy.
- Check the SEO category for missing title tags, duplicate meta descriptions and crawlability issues.
- Use Best Practices as a checklist for technical hygiene: HTTPS, no console errors, safe external links.
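Recording the scores per category can be a tiny script. A sketch, assuming you've already fetched the PageSpeed Insights JSON for each page; the `lighthouseResult.categories.*.score` path is the real response shape, with scores returned as 0 to 1 and multiplied by 100 here:

```python
def category_scores(psi_json):
    """Extract the four Lighthouse category scores (0-100) from a PSI response."""
    cats = psi_json["lighthouseResult"]["categories"]
    return {name: round(cat["score"] * 100) for name, cat in cats.items()}

# A trimmed-down sample response, containing just the fields we read:
sample = {"lighthouseResult": {"categories": {
    "performance": {"score": 0.92},
    "accessibility": {"score": 0.88},
    "best-practices": {"score": 1.0},
    "seo": {"score": 0.95},
}}}
print(category_scores(sample))
# {'performance': 92, 'accessibility': 88, 'best-practices': 100, 'seo': 95}
```

Dump the results into a spreadsheet per page and the inconsistencies discussed below become visible at a glance.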
Don't run Lighthouse only on your homepage. Inconsistent scores per page are a red flag: it means your technical foundation is uneven. It's like an ecosystem that blooms in some places and is barren in others. Something is off beneath the surface.
A Lighthouse score above 90 is no guarantee of a high AEO score. But a score below 70 is almost certainly holding it back. Fix the basic errors and you lift the biggest blockers in one move.
Want to know where your site stands? Run our free AEO scan at aeo-expert.nl. Or drop me a message and we'll look at it together.
Lighthouse is not just a performance tool. It's a quality audit that happens to speak the language AI models understand. The question isn't whether you should run it. The question is why you're not already doing it every week.
Frequently asked questions
Is a high Lighthouse score enough for a good AEO score?
No, and that's a trap I regularly have to steer clients away from. Lighthouse doesn't measure AEO-specific signals like llms.txt, Schema.org markup, E-E-A-T elements or robots.txt configuration for AI crawlers. A high Lighthouse score removes technical barriers. But you need additional AEO optimization to actively help AI models cite your content. Think of it as the foundation: solid, but without walls and a roof you don't have a house yet.
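One of those missing walls, Schema.org markup, is cheap to put up. A minimal sketch that generates JSON-LD for an article; the `@context` and `@type` fields follow the Schema.org vocabulary, but the helper, the example values and the chosen fields are illustrative, not a complete markup strategy:

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Build minimal Schema.org Article markup as a JSON-LD string."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "url": url,
    }
    # Embed the result in your page inside <script type="application/ld+json">.
    return json.dumps(data, indent=2)

print(article_jsonld("Lighthouse scores and AEO", "Jane Doe",
                     "2025-06-01", "https://example.com/lighthouse-aeo"))
```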
Which Lighthouse category has the most impact on AI visibility?
Based on our audits: Accessibility. The reason is simple. Alt texts, ARIA labels and semantic HTML are precisely the metadata AI models use to interpret page elements. Performance is the second priority, because a slow server simply gets less crawl attention. But if I had to choose where to start? Accessibility. Every time.
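You don't even need Lighthouse to spot the most common accessibility miss. A rough sketch using Python's standard-library HTML parser that flags images with no alt attribute at all (an empty alt="" is left alone, since that's legitimate for decorative images):

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Count <img> tags that lack an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            attrs = dict(attrs)
            if "alt" not in attrs:  # alt="" is fine for decorative images
                self.missing_alt.append(attrs.get("src", "(no src)"))

audit = AltTextAudit()
audit.feed('<img src="a.jpg" alt="Birdhouse"><img src="b.jpg"><img src="c.jpg" alt="">')
print(f"{len(audit.missing_alt)} of {audit.total} images missing alt: {audit.missing_alt}")
# 1 of 3 images missing alt: ['b.jpg']
```

Point it at your rendered HTML and you have a quick-fix list before you've even opened DevTools.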
Should I measure Lighthouse on mobile or desktop for AEO purposes?
Both, but for AI crawlers the desktop measurement is most representative. AI crawlers don't emulate a mobile device and aren't subject to the CPU and network throttling that the mobile test simulates. The desktop score gives the most accurate picture of what a crawler sees. Mobile matters for user experience and Google ranking, but for AI readiness you look primarily at desktop.
How does your website score on AI readiness?
Get your AEO score within 30 seconds and discover what you can improve.