
Server-side rendering versus client-side for AI

Bas Vermeer, SEO/AEO Specialist

Rendering and AI: why it matters

When a visitor opens your website, the browser builds the page from HTML, CSS and JavaScript. With server-side rendering (SSR), the server sends a fully constructed HTML page to the browser. With client-side rendering (CSR), the server sends a minimal HTML skeleton and lets JavaScript build the page in the browser. For human visitors the difference is barely noticeable, but for AI crawlers it is fundamental.

Most AI crawlers do not execute JavaScript. GPTBot, PerplexityBot and ClaudeBot read the initially delivered HTML and ignore content that only appears after JavaScript execution. This means that a React, Vue or Angular application that relies entirely on client-side rendering is virtually empty for AI crawlers. All content, all structured data and all meta information loaded via JavaScript remain invisible. This is one of the most underestimated technical factors in AEO strategy.

IMPORTANT

Most AI crawlers do not execute JavaScript. Content that is only available through client-side rendering does not exist for AI models. This applies to GPTBot, PerplexityBot, ClaudeBot and most other AI crawlers.

How AI crawlers process your page

To understand why rendering is so important, you need to know how AI crawlers technically operate. The process typically works as follows.

  1. The crawler sends an HTTP GET request to your URL and receives the initially delivered HTML (the "raw" server response).
  2. The crawler parses the HTML and extracts text, meta tags, JSON-LD structured data, links and heading hierarchy.
  3. Content that only appears after executing JavaScript is not seen, unless the crawler has a rendering engine (like Googlebot).
  4. The extracted information is processed, indexed or directly used for generating answers.

Googlebot is an exception: Google has a full rendering pipeline that executes JavaScript via a headless Chrome instance. But even with Googlebot, rendering happens with a delay (the "render queue"), so SSR content is processed faster. Across all known AI crawlers, SSR is the only reliable way to ensure your content is visible.
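
To make this concrete, here is a minimal sketch (in Node.js, assuming Node 18+ for the built-in fetch and run as an ES module) of roughly what steps 1 and 2 look like for a crawler that does not execute JavaScript. The URL, user agent and regular expressions are simplified placeholders, not the parsing logic of any specific bot.

// Rough sketch of a non-JavaScript crawler: fetch the raw server
// response and extract only what is present in that response.
// URL and user agent are placeholders; real bots parse HTML properly.
const url = 'https://yoursite.com/page';

const response = await fetch(url, {
  headers: { 'User-Agent': 'ExampleAIBot/1.0' },
});
const html = await response.text();

// Step 2: pull text, meta tags and JSON-LD out of the delivered HTML.
const title = html.match(/<title>(.*?)<\/title>/is)?.[1] ?? null;
const description =
  html.match(/<meta\s+name="description"\s+content="(.*?)"/is)?.[1] ?? null;
const jsonLd = [...html.matchAll(
  /<script type="application\/ld\+json">(.*?)<\/script>/gis
)].map((m) => JSON.parse(m[1].trim()));

// Anything that only appears after JavaScript execution is simply
// not part of `html`, so it can never be extracted here.
console.log({ title, description, jsonLd });

Run against an SSR page, this typically returns the full title, description and structured data; run against a pure CSR page, it returns empty or near-empty values.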

Server-side rendering: the gold standard for AI

With server-side rendering, the server builds the complete HTML page before sending it to the client. This is the traditional model used by frameworks like Laravel with Blade templates, Ruby on Rails and Django. It is also the approach that modern meta-frameworks like Next.js (React) and Nuxt (Vue) support by default.

<!-- SSR: the server delivers complete HTML -->
<!DOCTYPE html>
<html lang="en">
<head>
  <title>Server-Side Rendered Page</title>
  <meta name="description" content="All content is directly available in the HTML." />
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Server-Side Rendered Page",
    "author": { "@type": "Person", "name": "Jan de Vries" }
  }
  </script>
</head>
<body>
  <h1>Server-Side Rendered Page</h1>
  <p>This content is directly visible to AI crawlers
     without JavaScript execution.</p>
</body>
</html>

Advantages of SSR for AI visibility

  • All content is directly available in the initially delivered HTML, regardless of whether the crawler executes JavaScript.
  • Structured data (JSON-LD) is included in the server response and is guaranteed to be parsed.
  • Meta tags (title, description, Open Graph, canonical) are directly present and do not need to be dynamically generated.
  • Faster First Contentful Paint, which is not only good for user experience but also for crawlers with a time budget.
  • Easier to debug: the HTML source code shows exactly what crawlers see.

Client-side rendering: the AI visibility pitfall

Client-side rendering has become popular through frameworks like React, Vue and Angular. The approach offers advantages for interactive web applications, but is problematic for content that needs to be indexed by AI crawlers.

<!-- CSR: the server delivers an empty skeleton -->
<!DOCTYPE html>
<html lang="en">
<head>
  <title>My App</title>
</head>
<body>
  <div id="app"></div>
  <!-- All your content is only loaded after this script runs -->
  <script src="/js/app.bundle.js"></script>
</body>
</html>

<!-- This is what an AI crawler sees: an empty page
     with just a div and a script reference.
     No text, no structured data, no meta tags. -->

The example above illustrates the core problem. An AI crawler only sees the initially delivered HTML: an empty <div id="app"> and a reference to a JavaScript bundle. All text, images, structured data and navigation that JavaScript would load are invisible. This is similar to the problem with dynamically generated Schema.org markup: if it is not in the initially delivered HTML, it does not exist for most AI crawlers.
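
For comparison, this is the React pattern that typically produces such an empty skeleton: a hypothetical ArticlePage component that only fetches its content in the browser (the API route and field names are invented for illustration).

// CSR anti-pattern (sketch): the article body is fetched in the browser,
// so it never appears in the HTML the server sends out.
// `/api/article/42` and the component name are hypothetical.
import { useEffect, useState } from 'react';

export default function ArticlePage() {
  const [article, setArticle] = useState(null);

  useEffect(() => {
    // Runs only in the browser, after the JavaScript bundle has loaded.
    fetch('/api/article/42')
      .then((res) => res.json())
      .then(setArticle);
  }, []);

  if (!article) return <p>Loading…</p>;

  // A crawler that does not execute JavaScript only ever sees the
  // initial server response, never this rendered markup.
  return (
    <article>
      <h1>{article.title}</h1>
      <p>{article.body}</p>
    </article>
  );
}

Moving the same fetch into getServerSideProps or getStaticProps would put the article text into the initially delivered HTML and make it visible to crawlers.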

Hybrid solutions: the best of both worlds

Fortunately, you do not have to choose between pure SSR and pure CSR. Modern frameworks offer hybrid solutions that combine server-side rendering with client-side interactivity.

Static Site Generation (SSG)

With SSG, pages are pre-generated as static HTML files during the build process. This is the fastest option for AI crawlers, because the content is directly available without any server processing. Frameworks like Next.js (getStaticProps), Nuxt (nuxt generate) and Astro support SSG by default.

Incremental Static Regeneration (ISR)

ISR combines the speed of SSG with the flexibility of SSR. Pages are statically generated but can be updated in the background after a set interval. This is ideal for content that changes regularly but does not need to be real-time.

// Next.js: different rendering strategies
// (Each strategy lives in its own page file; shown together here for comparison.)

// 1. Server-Side Rendering (SSR)
// Page is rendered on the server with every request
export async function getServerSideProps() {
  const data = await fetchArticleData();
  return { props: { data } };
}

// 2. Static Site Generation (SSG)
// Page is generated once during build
export async function getStaticProps() {
  const data = await fetchArticleData();
  return { props: { data } };
}

// 3. Incremental Static Regeneration (ISR)
// Page is statically generated and periodically updated
export async function getStaticProps() {
  const data = await fetchArticleData();
  return {
    props: { data },
    revalidate: 3600, // Revalidate every hour
  };
}

Laravel and Livewire: SSR with interactivity

Laravel with Livewire is an excellent example of a framework that combines SSR with client-side interactivity. Blade templates are rendered server-side, making all content directly available to AI crawlers. Livewire adds interactivity through AJAX calls, but the initially delivered HTML always contains the full content. This makes Laravel with Livewire one of the most AI-friendly stacks available. Combined with strong E-E-A-T signals and correct structured data, this provides a solid foundation for AI visibility.

Testing your rendering from an AI perspective

How do you know whether AI crawlers can actually see your content? There are several ways to test this.

# Test 1: View the raw HTML source code
curl -s https://yoursite.com/page | head -100

# Test 2: Compare with a JavaScript-disabled browser
# Chrome: Settings > Privacy > Site settings > JavaScript > Off

# Test 3: Use a tool that does not execute JavaScript
curl -A "GPTBot/1.0" -s https://yoursite.com/page \
  | grep -c "<p>"

# If this returns 0, your content
# depends on JavaScript rendering

# Test 4: Google Rich Results Test
# Compare "rendered HTML" with "source code"
# in the Google Rich Results Test

  1. View the HTML source code (not "Inspect Element") via Ctrl+U or curl. This is the unrendered version that AI crawlers receive.
  2. Disable JavaScript in your browser and navigate through your site. All content that disappears is invisible to AI crawlers.
  3. Use the Google Rich Results Test and compare the source code with the rendered version. Large differences indicate JavaScript dependency.
  4. Check your structured data: is the JSON-LD block present in the raw HTML, or is it dynamically added by JavaScript?

The simplest test for AI visibility: disable JavaScript in your browser. Everything that disappears is invisible to most AI crawlers.
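
If you want to automate tests 3 and 4, the sketch below compares the raw server response with the JavaScript-rendered DOM. It assumes Node 18+ (run as an ES module) and Puppeteer installed via npm install puppeteer; the URL is a placeholder.

// Sketch: compare the raw server response with the rendered DOM.
import puppeteer from 'puppeteer';

const url = 'https://yoursite.com/page';
const count = (html, re) => (html.match(re) || []).length;

// What a non-JavaScript crawler receives.
const raw = await (await fetch(url)).text();

// What a JavaScript-executing crawler (like Googlebot) eventually sees.
const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.goto(url, { waitUntil: 'networkidle0' });
const rendered = await page.content();
await browser.close();

for (const [label, re] of [
  ['<p> elements', /<p[\s>]/g],
  ['JSON-LD blocks', /application\/ld\+json/g],
]) {
  console.log(label, '| raw:', count(raw, re), '| rendered:', count(rendered, re));
}
// Large gaps between "raw" and "rendered" mean the content
// depends on client-side rendering.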

Key takeaways

  • Most AI crawlers do not execute JavaScript. Only content in the initially delivered HTML is guaranteed to be visible.
  • Server-side rendering (SSR) is the gold standard for AI visibility: all content, structured data and meta tags are directly available.
  • Pure client-side rendering (CSR) makes your content invisible to AI crawlers. Avoid this for content you want indexed.
  • Hybrid solutions like SSG, ISR and Laravel with Livewire combine the benefits of SSR with client-side interactivity.
  • Test your rendering by disabling JavaScript, viewing the raw HTML source code and using tools like curl to verify what crawlers see.

Frequently asked questions

Does Googlebot execute JavaScript?

Yes, Googlebot has a full rendering engine based on headless Chrome. But this rendering takes place in a separate step (the "render queue") that can take days to weeks. Server-side rendered content is therefore processed faster. Moreover, other AI crawlers (GPTBot, PerplexityBot, ClaudeBot) are less advanced and typically do not execute JavaScript.

Can I use dynamic rendering as a compromise?

Dynamic rendering (serving different content to crawlers versus users) is officially discouraged by Google and is considered cloaking by some AI crawlers. It is a technically complex and fragile compromise that is better replaced by real SSR or hybrid solutions. If you still consider dynamic rendering, make sure the content is identical for both crawlers and users.

How does Livewire technically work for AI crawlers?

Livewire renders the initial state of a component server-side as complete HTML. After the page loads, JavaScript takes over for interactive updates via AJAX. Because the first render happens server-side, AI crawlers see the full content including all text, structured data and meta tags. Later interactive changes (like form validation or live search) are not visible to crawlers, but these are typically not relevant for indexing anyway.

My React app runs on Next.js with SSR. Is that sufficient?

If you correctly configure Next.js with getServerSideProps or getStaticProps, your content is server-side rendered and visible to AI crawlers. Verify this by viewing the raw HTML source code: if your content is present there, your configuration is correct. Be aware that some components may still render client-side (useEffect, dynamic imports with ssr: false). Test each page individually.
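
As an illustration of what to look for, the sketch below shows two common patterns in a hypothetical Next.js (pages router) component whose output never reaches the server-rendered HTML; the component, API route and prop names are invented for the example.

// Sketch: two patterns whose output does NOT end up in the server HTML.
import dynamic from 'next/dynamic';
import { useEffect, useState } from 'react';

// 1. A dynamic import with ssr: false renders only in the browser.
const PriceWidget = dynamic(() => import('../components/PriceWidget'), {
  ssr: false,
});

export default function ProductPage({ product }) {
  // 2. Data fetched in useEffect is also invisible to non-JS crawlers.
  const [reviews, setReviews] = useState([]);
  useEffect(() => {
    fetch(`/api/reviews/${product.id}`)
      .then((res) => res.json())
      .then(setReviews);
  }, [product.id]);

  return (
    <main>
      {/* Server-rendered via getStaticProps/getServerSideProps: visible. */}
      <h1>{product.name}</h1>
      <p>{product.description}</p>

      {/* Client-side only: invisible in the raw HTML. */}
      <PriceWidget productId={product.id} />
      <ul>{reviews.map((r) => <li key={r.id}>{r.text}</li>)}</ul>
    </main>
  );
}

Patterns like these are fine for purely interactive widgets, but anything you want AI crawlers to read should be fetched in getStaticProps or getServerSideProps instead.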

Does rendering affect my Core Web Vitals?

Yes, significantly. Server-side rendering typically delivers better Largest Contentful Paint (LCP) scores because the content is directly in the HTML. Client-side rendering requires the browser to first download, parse and execute JavaScript before content becomes visible, which delays LCP. SSG offers the best scores because the HTML is fully static. Core Web Vitals are not a direct factor for AI citations, but Google does use them as a quality signal for AI Overviews.

The most beautiful JavaScript framework in the world is worthless if AI crawlers cannot read your content. Always choose a rendering strategy that delivers server-side HTML.

