
Downtime costs you AI citations: why monitoring is essential

Reinier Sierag, Founder of Kobalt

AI crawlers are not patient visitors

I'm going to make this a bit uncomfortable. Because I know most business owners see monitoring as something for the IT department. Something technical. Something boring. Something that "the hosting provider handles."

Well, let me tell you about the time a client of ours had four hours of downtime due to a server migration. Planned, properly announced, everything by the book. We just hadn't considered that GPTBot would visit during that exact time window.

It took three weeks for citation levels to recover. Three weeks. For four hours of downtime.

That's like slamming the front door shut at the exact moment there's a scout in the audience. Maybe they'll come back. But probably not soon.

REAL WORLD

At a Kobalt client in professional services, we saw a direct correlation between that planned migration downtime of four hours and a dip in AI citations. Citation levels didn't recover until three weeks later. Four hours of downtime, three weeks of recovery. That's the bill.

How AI crawlers work (and why they don't wait)

A human visitor who sees an error page might come back tomorrow. An AI crawler? Forget it. GPTBot, ClaudeBot and PerplexityBot operate on their own schedules: they visit your site at intervals that depend on how often your content changes and how popular your domain is.

If your server returns a 503 at that moment, the crawler logs a failure and skips your page.

How long before that crawler returns? For popular domains: days. For smaller sites: weeks. Sometimes a month.

Every downtime episode is a window that closes. And you don't even know when it was open.

What good monitoring actually looks like

I've been managing hosting infrastructure for over twenty years now, and there are two things I need to address here: monitoring as a technical practice, and monitoring as business protection. It's both.

Here's what I consider the minimum (a small check sketch follows the list):

  • Check interval of at most 1 minute for your homepage and critical pages. Not 5 minutes. Not 15 minutes. One minute.
  • Alerting via multiple channels: SMS or WhatsApp for critical alerts, email for warnings. Because if you only use email, you'll read it three hours later.
  • Monitoring from multiple locations: a problem in Amsterdam is not the same as a global problem.
  • HTTP status code monitoring: not just ping, but also checking for 200 OK versus redirect chains that grow too long.
  • Response time monitoring: a page that loads in 8 seconds is functionally the same as offline for a crawler.
  • SSL certificate monitoring: an expired certificate kicks out AI crawlers just as hard as a 503. And yes, I've experienced this. More than once.
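
To make those points concrete, here is a minimal sketch in Python of the kind of check a monitoring tool runs for you every minute, from multiple locations. The URL, hostname and thresholds are placeholders, and the alerting is just a print; a real setup would push failures to SMS, WhatsApp or Slack.

```python
import socket
import ssl
import time

import requests

URL = "https://www.example.com/"   # placeholder: your homepage or a critical page
HOSTNAME = "www.example.com"       # placeholder: hostname for the TLS check
MAX_RESPONSE_SECONDS = 3           # treat anything slower as degraded
MIN_CERT_DAYS_LEFT = 14            # warn well before the certificate expires


def check_http(url: str) -> list[str]:
    """Return a list of problems for a single URL (empty list means healthy)."""
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.RequestException as exc:
        return [f"request failed: {exc}"]
    problems = []
    if resp.status_code != 200:
        problems.append(f"status {resp.status_code} instead of 200")
    if len(resp.history) > 2:
        problems.append(f"redirect chain of {len(resp.history)} hops")
    if resp.elapsed.total_seconds() > MAX_RESPONSE_SECONDS:
        problems.append(f"slow response: {resp.elapsed.total_seconds():.1f}s")
    return problems


def check_certificate(hostname: str) -> list[str]:
    """Warn when the TLS certificate is close to expiry."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    expires = ssl.cert_time_to_seconds(cert["notAfter"])
    days_left = int((expires - time.time()) // 86400)
    if days_left < MIN_CERT_DAYS_LEFT:
        return [f"certificate expires in {days_left} days"]
    return []


if __name__ == "__main__":
    issues = check_http(URL) + check_certificate(HOSTNAME)
    if issues:
        # In a real setup this goes to SMS/WhatsApp/Slack, not stdout.
        print("ALERT:", "; ".join(issues))
    else:
        print("OK")
```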

Tools I use (and why)

The market is large, but these are the tools I actually use. Not because anyone pays me to mention them, but because they work:

  1. UptimeRobot (free tier): 5-minute check interval, sufficient for small sites. Easy to set up, reliable enough to start with.
  2. Better Stack: 30-second check interval, multiple locations, Slack and PagerDuty integration. My personal favorite for professional environments.
  3. Grafana Cloud with Synthetic Monitoring: for teams already in the Grafana stack. Powerful but requires technical setup.
  4. Cloudflare Health Checks: if you already use Cloudflare, the built-in checks are a logical first step at no extra cost.

TIP

Also set up a monitor on your robots.txt and llms.txt files. If these disappear or return a 404 after a server update, it directly impacts how AI crawlers approach your site. And the annoying part: you won't notice it in your normal analytics. Only weeks later, when your citations dry up.
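
If your monitoring tool doesn't support this out of the box, the check itself is trivial. A minimal sketch, assuming you only want to know that both files exist and return a 200 (the domain is a placeholder):

```python
import requests

DOMAIN = "https://www.example.com"  # placeholder: your own domain
CRAWLER_FILES = ["/robots.txt", "/llms.txt"]

for path in CRAWLER_FILES:
    resp = requests.get(DOMAIN + path, timeout=10)
    # A 404 after a deploy or server migration is exactly the failure mode to catch.
    if resp.status_code != 200:
        print(f"ALERT: {path} returns {resp.status_code}")
    elif not resp.text.strip():
        print(f"ALERT: {path} is empty")
```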

What your hosting contract needs to say

Many companies choose hosting on price. Understandable. But with AEO optimization, uptime becomes a competitive advantage. Check your contract for these points:

  • 99.9% uptime SLA is the absolute minimum. That still allows roughly 8.8 hours of downtime per year (see the quick calculation after this list).
  • 99.95% or higher is better. The difference sounds small, but with intensive AI crawling every episode counts.
  • Planned maintenance windows: make sure your provider informs you in time. Then you can at least monitor whether the downtime falls within the window.
  • Compensation for SLA breach: a provider that pays for excessive downtime has an incentive to take it seriously.
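
The arithmetic behind those percentages is worth doing once yourself; an SLA figure translates directly into allowed downtime per year:

```python
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

for sla in (99.0, 99.9, 99.95, 99.99):
    allowed_hours = HOURS_PER_YEAR * (1 - sla / 100)
    print(f"{sla}% uptime allows {allowed_hours:.1f} hours of downtime per year")

# 99.0%  -> 87.6 hours
# 99.9%  -> 8.8 hours
# 99.95% -> 4.4 hours
# 99.99% -> 0.9 hours
```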

Cheap shared hosting with a vague "uptime guarantee" and no SLA is not suitable for a site that's serious about AI visibility. That's not a sales pitch. That's mathematics.

At Kobalt we help clients choose hosting that fits their ambitions. Not always the most expensive, but always the right one.

Four hours of downtime at the wrong moment costs you more than four weeks of AEO optimization gains. Monitoring is not overhead. It's protection for your investment. And if you think differently, call me after your first AI citation dip.

Frequently asked questions

How often do AI models crawl my website?

This isn't publicly documented and differs per crawler and per site. Based on server log analysis at Kobalt clients, we see that popular domains are crawled multiple times per week by GPTBot or ClaudeBot. Smaller sites? Sometimes only once every two to four weeks. You can check the frequency in your server access logs by filtering on AI user agents. And trust me: it's an eye-opener when you do it for the first time.
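
If you want to see it for your own site, a few lines are enough. This sketch assumes an nginx/Apache-style access log at a placeholder path; adjust the path and the user-agent list to your setup:

```python
import re
from collections import Counter

LOG_FILE = "/var/log/nginx/access.log"  # placeholder: your access log path
AI_BOTS = ("GPTBot", "ClaudeBot", "PerplexityBot")

hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        bot = next((b for b in AI_BOTS if b in line), None)
        if bot is None:
            continue
        # Common Log Format puts the timestamp in [14/Apr/2026:10:15:32 +0000].
        match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
        day = match.group(1) if match else "unknown"
        hits[(bot, day)] += 1

for (bot, day), count in sorted(hits.items()):
    print(f"{day}  {bot:<15} {count} requests")
```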

Does a CDN help avoid downtime for AI crawlers?

Yes, absolutely. A CDN like Cloudflare or Fastly acts as a buffer: if your origin server is temporarily unreachable, the CDN can keep serving cached responses. Configure your CDN to serve cached pages when the origin returns a 503. In Cache-Control terms that is "stale-if-error" (serve stale on error); the related "stale-while-revalidate" directive lets the CDN serve a cached copy while it fetches a fresh one in the background. It's not bulletproof (a DDoS can also hit the CDN), but it reduces the risk enormously.
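
Whether stale serving is configured via origin headers or in the CDN dashboard differs per provider. If yours honors Cache-Control, a quick way to verify what your origin sends is to inspect the response headers (the URL is a placeholder):

```python
import requests

URL = "https://www.example.com/"  # placeholder: a page an AI crawler would fetch

resp = requests.get(URL, timeout=10)
cache_control = resp.headers.get("Cache-Control", "")

print("Cache-Control:", cache_control or "(not set)")

# stale-if-error tells the CDN it may keep serving a cached copy when the
# origin returns 5xx; stale-while-revalidate lets it refresh in the background.
for directive in ("stale-if-error", "stale-while-revalidate"):
    status = "present" if directive in cache_control else "missing"
    print(f"{directive}: {status}")
```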

What do I do when I have planned maintenance?

Schedule it during off-peak hours for your target audience and outside periods of intensive AI crawling. Based on log analysis, we see a peak in bot traffic at most sites in the early morning hours (2:00-6:00 CET). Weekends are usually safest, and keep the duration to the absolute minimum. Sometimes you just have to lay down a bunt instead of swinging for a home run: short, controlled, minimal damage.

