Lazy loading and AI crawlers: when it hurts your visibility
What is lazy loading and why would you use it?
I am a big fan of lazy loading. Truly. As someone who has been working on web performance for over twenty years, I have witnessed the evolution from "load everything at once and hope for the best" to smart techniques that only load what is needed. Lazy loading is one of the finest examples.
The idea is simple: images, videos and sometimes entire content blocks are only loaded when the visitor scrolls to them. Faster page, less bandwidth, better Core Web Vitals. Everyone happy.
But wait. Not everyone.
Because there is a group of "visitors" that never scrolls, never clicks and does not execute JavaScript. AI crawlers. And that group is becoming increasingly important.
GPTBot, ClaudeBot, PerplexityBot: they behave like a simple HTTP client. Fetch HTML, read it, leave. No scroll events, no IntersectionObserver, no JavaScript. Content behind lazy loading? It literally does not exist for them.
Not all lazy loading is equally harmful
I have to be honest here: for months I myself did not draw this distinction sharply enough. Lazy loading images via `loading="lazy"` is perfectly fine. The HTML is right there, including alt text and src. Only the file itself is loaded later. AI crawlers read context, not pixels. No problem.
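A minimal sketch of what that looks like (the filename and alt text here are invented for illustration):

```html
<!-- Native lazy loading: the src and alt text sit in the HTML,
     so a crawler reads the context. Only the image bytes are
     fetched later, when the visitor scrolls near it. -->
<img src="/images/blue-trail-shoe.jpg"
     alt="Blue trail running shoe, side view"
     loading="lazy"
     width="800" height="600">
```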
But then there is the other variant. The one that gives me chills.
- JavaScript-driven content via fetch or XHR after page load: completely invisible to AI crawlers.
- Tabs and accordions where content only loads on click: AI bots do not click.
- Infinite scroll or "load more" buttons: AI only sees the first batch.
- Skeleton screens filled via API calls: the crawler sees an empty skeleton. Literally. (A sketch of this anti-pattern follows below.)
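To make that last failure mode concrete, here is a minimal sketch; the `/api/specs` endpoint, the element ID and the data shape are all hypothetical. A browser runs the script and fills in the content. An AI crawler does not, so it only ever sees the placeholder text:

```html
<!-- What the crawler sees: an empty skeleton. -->
<div id="product-specs" class="skeleton">Loading specifications…</div>

<script>
  // What the browser does after page load: fetch the real content.
  // Endpoint and data shape are hypothetical, for illustration only.
  fetch('/api/specs')
    .then(response => response.json())
    .then(specs => {
      document.getElementById('product-specs').innerHTML =
        '<dl>' +
        specs.map(s => `<dt>${s.label}</dt><dd>${s.value}</dd>`).join('') +
        '</dl>';
    });
</script>
```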
What I encounter at clients
At Kobalt we see this pattern surprisingly often. Just recently at an e-commerce client. Beautiful product pages. Extensive specifications, reviews, technical documents. Everything neatly divided across tabs.
The problem? The "Specifications" and "Reviews" tabs only load their content when you click on them. For Google this is not such a disaster in 2026, because Googlebot renders JavaScript. But AI crawlers? They see the tab labels and nothing else.
I literally told that client: "You have written fantastic content that does not exist for half the internet." That was a sobering meeting.
Practical alternatives that combine performance and visibility
Fortunately, you do not have to choose. This is not an either-or situation. It is a matter of building smart. Sometimes you just have to lay down a bunt instead of swinging for a home run.
- Always put critical content in the server-rendered HTML. Text and structured data belong there. Decorative elements and heavy images may lazy load.
- Use CSS-only accordions with `<details>` and `<summary>`. The content is in the HTML, crawlers read it just fine. The browser hides it visually. Elegant and effective. (A sketch follows after this list.)
- Replace JavaScript tabs with a section layout. All content on the page, CSS handles the visual structuring. Zero JavaScript, zero crawler problems.
- Stuck with JavaScript lazy loading anyway? Then consider server-side rendering for the content that actually matters.
- The `<noscript>` element offers a fallback: put a plain HTML version of lazy-loaded content inside it, so clients that do not execute JavaScript still get the full markup.
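Here is a minimal sketch of the `<details>`/`<summary>` approach; the headings and copy are placeholders. Every word sits in the server-rendered HTML, so a crawler that never clicks still reads all of it:

```html
<!-- Native disclosure widget, zero JavaScript. The full text is
     in the HTML whether or not the section is expanded. -->
<details>
  <summary>Specifications</summary>
  <dl>
    <dt>Weight</dt><dd>280 grams</dd>
    <dt>Material</dt><dd>Recycled mesh</dd>
  </dl>
</details>

<details>
  <summary>Reviews</summary>
  <p>"Great shoe for muddy trails." A verified buyer.</p>
</details>
```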
Open your terminal and do a simple curl to your own page. What you see is what AI crawlers see too. Search that output for your most important content. Not there? Then you have a problem. It is that simple.
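For example, along these lines (the URL and the phrase are placeholders; swap in your own page and a sentence from your most important content):

```sh
# Fetch the raw HTML exactly as a non-rendering crawler would,
# then check whether a key phrase from your content is present.
curl -sL https://www.example.com/product-page | grep -i "recycled mesh"
```

No match? Then that content lives behind JavaScript.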
Finding the balance
Look, I am not going to tell you to throw out all your lazy loading. That would be hypocritical from someone who once spent an entire evening shaving 12 milliseconds off a TTFB. (My wife was less impressed than I was.)
But many websites have lost the balance. They optimize so aggressively for speed that they forget there is a growing group of bots that simply cannot read their content. And those bots increasingly determine whether your expertise is visible in AI answers.
A well-built website is like an ecosystem: everything is connected. Performance and visibility are not opposites. They reinforce each other, provided you make the right technical choices.
But it does require conscious decisions, because the default approach of many frameworks puts content behind JavaScript walls. Want to know how your site scores? Our free AEO scan will show you.
Frequently asked questions
Does lazy loading affect my regular Google rankings?
Less than you think, because Googlebot executes JavaScript and crawls rendered HTML. But it costs Google more resources, which can indirectly affect your crawl budget. For AI crawlers it is a much bigger problem. My advice: always put critical content in the static HTML. Then you are covered for everything.
Is the `loading="lazy"` attribute on images a problem?
No. Safe. The alt text, filename and structural context are right there in the HTML. The only thing loaded later is the actual file. AI crawlers do not read pixels, so it does not matter.
My CMS generates JavaScript lazy loading automatically. What now?
Check whether your CMS can enable server-side rendering for critical content. In WordPress there are plugins that convert JavaScript output to static HTML. Not possible? Consider a headless approach with a static HTML layer on top of your CMS. More work, but it solves the problem structurally. At Kobalt we regularly help with this.
How does your website score on AI readiness?
Get your AEO score within 30 seconds and discover what you can improve.