Core Web Vitals and AI: why speed matters for citations
Why Core Web Vitals matter for AI citations
When AI models like ChatGPT, Perplexity and Gemini fetch content from the web, they have a limited time budget per page. A page that loads slowly or whose DOM only stabilizes after several seconds delivers less usable data within that time window. The consequence is that slow websites are structurally cited less often in AI-generated answers. Core Web Vitals, the performance metrics Google introduced in 2021, are therefore no longer solely an SEO factor. They have become a direct prerequisite for AI visibility.
This closely aligns with the broader technical requirements AI models place on websites. In our article about AEO and why it matters, we discuss how technical machine-readability forms one of the three pillars of effective AI optimization. Core Web Vitals are the foundation of that technical pillar.
Perplexity fetches pages in real time for every search query. A page that takes longer than 3 seconds to serve usable content is often skipped in favor of faster alternatives.
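As an illustration, the time budgets described above can be sketched as a small helper. The exact cutoffs below are assumptions derived from the figures in this article, not officially published limits:

```javascript
// Hypothetical per-crawler time budgets in milliseconds, based on the
// figures mentioned in this article (not officially published limits).
const CRAWLER_BUDGETS_MS = {
  perplexitybot: 3000, // real-time fetching, strictest budget
  gptbot: 5000,        // periodic crawling, somewhat more tolerant
  claudebot: 5000,
};

// Returns true if a page that serves usable content within
// `contentReadyMs` likely fits inside the crawler's time budget.
function fitsCrawlerBudget(crawler, contentReadyMs) {
  const budget = CRAWLER_BUDGETS_MS[crawler.toLowerCase()];
  if (budget === undefined) return true; // unknown crawler: assume tolerant
  return contentReadyMs <= budget;
}
```

The takeaway: a page that is ready in 1.5 seconds fits every budget, while a page that needs 4 seconds already falls outside the strictest one.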
The three Core Web Vitals explained
Google measures the user experience of a web page using three core metrics. Each metric captures a different aspect of how pleasant (or unpleasant) a page is to visit, and each has its own impact on how AI crawlers process your content.
Largest Contentful Paint (LCP)
LCP measures how long it takes for the largest visible element on the page to load. This could be a hero image, a large heading or a video frame. Google considers an LCP below 2.5 seconds as good, between 2.5 and 4 seconds as moderate, and above 4 seconds as poor.
For AI crawlers, LCP is particularly relevant. When a crawler like PerplexityBot or GPTBot requests your page, it does not wait indefinitely for the complete render. If your primary content only becomes available after 4 seconds, the crawler may miss the most important information. The result: your page is visited but not effectively indexed.
Interaction to Next Paint (INP)
INP replaced First Input Delay (FID) in March 2024 and measures how responsively a page reacts to user interactions. An INP below 200 milliseconds is good. Although AI crawlers do not click or scroll, INP is indirectly relevant. A high INP often indicates heavy JavaScript load that also delays the initial DOM construction, which crawlers do notice.
Cumulative Layout Shift (CLS)
CLS measures the visual stability of a page: how much elements shift during loading. A CLS below 0.1 is good. For AI crawlers parsing the DOM, a high CLS can mean that content elements only reach their final position late. Crawlers that parse the page at an early moment may then see an incomplete or distorted version of your content.
# Ideal Core Web Vitals scores for AI visibility
# Measure with Google PageSpeed Insights or Lighthouse
LCP (Largest Contentful Paint): < 2.5s (good)
INP (Interaction to Next Paint): < 200ms (good)
CLS (Cumulative Layout Shift): < 0.1 (good)
# For maximum AI crawler compatibility:
Time to First Byte (TTFB): < 800ms
First Contentful Paint (FCP): < 1.8s
Server-side rendered HTML: essential

How AI crawlers experience speed
It is important to understand that AI crawlers do not have the same experience as a human visitor with a browser. Most AI crawlers request the raw HTML without executing JavaScript. This means that client-side rendered content can be invisible to them, regardless of how fast your JavaScript runs.
This directly relates to the problem we discuss in our article about robots.txt for AI: even when you allow AI crawlers to visit your site, they must also actually receive usable content. A server that quickly returns a fully rendered HTML document scores better than a server that sends an empty HTML skeleton that is only populated via JavaScript.
- GPTBot and ClaudeBot typically do not execute JavaScript. They only process the initial HTML response.
- PerplexityBot fetches pages in real time and has a strict time budget per request.
- Googlebot (which also feeds Gemini) does execute JavaScript, but with a delay. Server-side rendering remains more advantageous.
- All AI crawlers benefit from a low TTFB (Time to First Byte), as this reduces the wait time until the first usable data.
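You can check your own TTFB directly in the browser via the Navigation Timing API. The rating thresholds in this sketch follow the conventions used by the web-vitals library (good below 800 ms, needs-improvement below 1800 ms):

```javascript
// Rate a TTFB value using the thresholds the web-vitals library applies:
// good < 800 ms, needs-improvement < 1800 ms, poor above that.
function rateTtfb(ms) {
  if (ms < 800) return 'good';
  if (ms < 1800) return 'needs-improvement';
  return 'poor';
}

// In the browser, read the actual TTFB from the Navigation Timing API.
function measureTtfb() {
  const [nav] = performance.getEntriesByType('navigation');
  return nav.responseStart - nav.startTime;
}
```

Run `rateTtfb(measureTtfb())` in the DevTools console on your own pages; anything other than 'good' means AI crawlers are waiting longer than necessary for the first usable data.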
Concrete steps to improve your Core Web Vitals
Improving Core Web Vitals requires a systematic approach. Below are the most impactful optimizations, ordered by expected effect.
Improving LCP
- Optimize your server response time. Use caching (Redis, Varnish or a CDN) to get TTFB below 800ms.
- Preload the LCP element. If it is an image, use a preload link in the head of your document.
- Serve images in modern formats (WebP or AVIF) and size them appropriately for the viewport.
- Eliminate render-blocking CSS and JavaScript. Load critical CSS inline and defer non-essential JavaScript.
- Consider server-side rendering (SSR) instead of client-side rendering, so the HTML is already complete when crawlers request your page.
<!-- Preload LCP image -->
<link rel="preload" as="image" href="/images/hero.webp" type="image/webp" fetchpriority="high" />

<!-- Load critical CSS inline -->
<style>
  .hero { width: 100%; height: auto; }
  h1 { font-size: 2.5rem; line-height: 1.2; }
</style>

<!-- Defer non-essential JavaScript -->
<script src="/js/analytics.js" defer></script>
<script src="/js/chat-widget.js" async></script>

Improving CLS
- Always specify explicit dimensions for images and videos (width and height attributes).
- Reserve space for ads and embeds with CSS aspect-ratio or min-height.
- Load web fonts with font-display: swap and preload the most important font files.
- Avoid dynamically injecting content above existing elements.
<!-- Explicit dimensions prevent layout shift -->
<img src="/images/article-hero.webp"
     width="1200" height="630"
     alt="Illustration of Core Web Vitals metrics"
     loading="eager"
     decoding="async" />

<!-- CSS aspect-ratio for embeds -->
<style>
  .video-embed {
    aspect-ratio: 16 / 9;
    width: 100%;
    background: #f0f0f0;
  }
</style>

Dive deeper: Schema.org markup for AI | HTTPS and HSTS as trust signals | Canonical URLs and duplicate prevention
Server-side rendering versus client-side rendering
The choice between server-side rendering (SSR) and client-side rendering (CSR) has an enormous impact on how AI crawlers process your content. With SSR, the complete HTML is generated on the server and delivered to the client as a finished document. With CSR, the client receives a minimal HTML skeleton that is only populated with content via JavaScript.
For AI crawlers that do not execute JavaScript, the difference is dramatic. An SSR page delivers all content immediately, while a CSR page is effectively empty. Even for crawlers that do support JavaScript (like Googlebot), SSR is more advantageous because it drastically reduces processing time.
A website that serves complete, server-rendered HTML in 1.5 seconds is successfully indexed by AI crawlers up to 3x more often than a comparable SPA requiring 4 seconds of JavaScript rendering.
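The contrast can be made concrete with a minimal sketch. The `renderArticleSSR` helper below is hypothetical, but it shows why a crawler that never executes JavaScript sees everything in the SSR response and nothing in the CSR skeleton:

```javascript
// SSR sketch: the server builds the complete document, so a crawler that
// never runs JavaScript still receives all content in a single response.
function renderArticleSSR({ title, body }) {
  return `<!doctype html>
<html>
<head><title>${title}</title></head>
<body>
  <h1>${title}</h1>
  <article>${body}</article>
</body>
</html>`;
}

// CSR skeleton for comparison: the same crawler sees an empty shell,
// because the content only arrives after app.js executes in a browser.
const csrSkeleton = `<!doctype html>
<html>
<head><script src="/app.js" defer></script></head>
<body><div id="root"></div></body>
</html>`;
```

A crawler parsing the SSR output finds the heading and article text immediately; parsing the CSR skeleton yields an empty `div` and nothing to cite.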
Measuring and monitoring Core Web Vitals
You can measure your Core Web Vitals in several ways. Each measurement tool offers unique insights.
- Google PageSpeed Insights: combines field data (real users via CrUX) with lab data (simulated Lighthouse test). The most complete source for an initial analysis.
- Google Search Console: shows Core Web Vitals at page level for your entire site. Ideal for identifying structural problems.
- Lighthouse (in Chrome DevTools): provides detailed recommendations per page. Run it in an incognito window so browser extensions do not skew the results.
- Web Vitals JavaScript Library: measures Core Web Vitals in the browsers of your real visitors. Essential for continuous monitoring.
// Measuring Core Web Vitals with the web-vitals library
import { onLCP, onINP, onCLS } from 'web-vitals';
function sendToAnalytics(metric) {
const body = JSON.stringify({
name: metric.name,
value: metric.value,
rating: metric.rating, // 'good', 'needs-improvement', or 'poor'
delta: metric.delta,
id: metric.id,
});
navigator.sendBeacon('/api/vitals', body);
}
onLCP(sendToAnalytics);
onINP(sendToAnalytics);
onCLS(sendToAnalytics);

Key takeaways
- Core Web Vitals (LCP, INP, CLS) directly influence how effectively AI crawlers can fetch and process your content.
- An LCP above 4 seconds means AI crawlers may miss your most important content due to their limited time budget.
- Server-side rendering is essential for AI visibility, as most AI crawlers do not execute JavaScript.
- Invest in a low TTFB (below 800ms) and preload critical elements for the fastest possible content delivery.
- Continuously measure your Core Web Vitals with Google Search Console and the web-vitals library to detect regressions early.
Frequently asked questions
Do AI models actually measure the speed of my website?
AI models do not explicitly measure your speed the way Google PageSpeed Insights does, but they experience its effects. Perplexity fetches pages in real time and has a timeout per request. GPTBot and ClaudeBot crawl periodically and process the raw HTML response. If that response arrives slowly or is incomplete due to heavy JavaScript dependencies, it results in less usable data and ultimately fewer citations.
Is LCP the most important metric for AI visibility?
For AI crawlers, LCP combined with TTFB is the most relevant metric, as it directly determines how quickly usable content becomes available. CLS is less relevant for crawlers that do not perform visual rendering, but it often points to underlying problems that also slow down HTML delivery. INP is the least directly relevant for AI crawlers, but indicates heavy JavaScript load that can indirectly slow page loading.
How fast should my pages load for AI crawlers?
Aim for a TTFB below 800 milliseconds and a fully rendered HTML document within 2 seconds. Perplexity has the strictest time budget and skips pages that take longer than 3 to 5 seconds. GPTBot and ClaudeBot are slightly more tolerant, but slow pages are also crawled less frequently and less completely by them.
Does a CDN help with AI visibility?
Yes, a Content Delivery Network helps significantly. CDNs lower your TTFB by serving content from a server close to the crawler. Most AI crawlers operate from data centers in the US and Europe. A CDN with edge servers in these regions considerably shortens response time. Additionally, CDNs provide caching that reduces the load on your origin server during intensive crawling.
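A practical companion to a CDN is sending explicit cache headers, so edge servers know what they may cache and for how long. A sketch of one possible policy follows; the helper name and the exact TTL values are illustrative assumptions, not universal recommendations:

```javascript
// Hypothetical cache-header helper: long, immutable TTLs for static
// assets; a short edge TTL plus stale-while-revalidate for HTML, so
// crawlers get a fast cached copy that is never far out of date.
function cacheHeadersFor(path) {
  if (/\.(webp|avif|css|js|woff2)$/.test(path)) {
    // Fingerprinted assets can be cached for a year and never revalidated.
    return { 'Cache-Control': 'public, max-age=31536000, immutable' };
  }
  // HTML: browsers revalidate, the CDN edge caches for an hour and may
  // serve a stale copy for a day while fetching a fresh one in background.
  return {
    'Cache-Control': 'public, max-age=0, s-maxage=3600, stale-while-revalidate=86400',
  };
}
```

With a policy like this, repeated crawler requests are answered from the edge instead of your origin server, which keeps TTFB low even during intensive crawling.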
Can I improve Core Web Vitals without redesigning my website?
Many improvements are possible without visual changes. Compressing images to WebP format, adding a caching layer, deferring render-blocking scripts and enabling a CDN are all technical optimizations with no impact on the design. Only with fundamental architecture problems (such as a fully client-side rendered SPA) may a larger overhaul be necessary.
Speed is no longer a luxury. For AI crawlers, it is a hard requirement to be able to process your content at all.
How does your website score on AI readiness?
Get your AEO score within 30 seconds and discover what you can improve.