Choosing hosting that can handle AI crawlers
Why hosting is the most underestimated AEO component
I am going to say something hosting providers do not like to hear: most hosting packages are not built for the reality of AI crawlers.
At Kobalt we manage hosting for dozens of clients. And if those years have taught me one thing, it is that poor hosting undoes every other investment. You can have a perfect llms.txt, excellent schema.org markup and tight content structure. But if your server is slow, unreliable or geographically on the other side of the world? Then much of that effort is lost.
AI crawlers are less forgiving than humans. A human clicks away and tries again tomorrow. An AI crawler registers the slow response time, adjusts its crawl budget for your domain and crawls less deeply the next session. Structurally. That is not an incident, that is a pattern.
The four factors that matter
Server response time: the first impression
Time to First Byte (TTFB) is the most direct measure of how quickly your server responds. For AI crawlers, this is the first hurdle. A TTFB above 800 milliseconds? That signals a server that is overloaded, undersized or poorly configured.
On shared hosting, with hundreds of other websites on the same server, a TTFB of 1.5 to 3 seconds is not unusual. I have seen it. Regularly.
# Measure your TTFB via curl
curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\nTotal: %{time_total}s\n" https://yourdomain.com
# Ideal result:
# TTFB: 0.180s
# Total: 0.820s
# Worrying result:
# TTFB: 1.840s
# Total: 4.230s
Concurrent connections: can your server handle it?
AI crawlers do not crawl one page at a time. They send multiple simultaneous requests. If your hosting plan caps concurrent connections (and most shared plans do), those crawl requests end up in a queue. Or worse: they get rejected. A quick way to test your own limit follows the list below.
- Shared hosting: typically 10-25 concurrent connections. Not ideal.
- VPS or cloud: typically 50-200 connections, depending on configuration. Sufficient for most sites.
- Dedicated server: full control, scalable as needed.
- Managed WordPress hosting (Kinsta, WP Engine): optimized, good concurrent capacity.
- CDN layer in front: Cloudflare or Fastly significantly reduces server load.
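Want to know where your own limit sits? ApacheBench gives a rough picture. A minimal sketch, assuming the ab tool is installed and with yourdomain.com as a placeholder:

# 100 requests, 20 concurrent: roughly what a crawler burst looks like
ab -n 100 -c 20 https://yourdomain.com/

# Watch the "Failed requests" line and the 95th percentile response time.
# If failures appear at -c 20, your plan's connection limit is the likely culprit.

Keep the numbers modest on a production site. The point is to observe behavior under parallel load, not to knock your own server over.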
Location: the closer to the crawler, the faster the response
Most major AI companies are American. Their crawlers start from US or European data centers. Server in Amsterdam, crawler in Virginia? That is 80-120 milliseconds of network latency before anything even happens. Manageable. But combine that with a slow TTFB and you quickly have 2 seconds of total delay.
For Dutch companies I recommend Amsterdam or Frankfurt as server location. Good latency for local visitors, acceptable latency for American crawlers. A CDN with US edge locations is an efficient addition.
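You can measure the network component yourself. curl breaks the total time into phases; a minimal sketch, with yourdomain.com as a placeholder:

# Separate network latency from server processing time
curl -o /dev/null -s -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTTFB: %{time_starttransfer}s\n" https://yourdomain.com

# Connect minus DNS is roughly one network round trip.
# Run the same command from a cheap US cloud VM to see what an American crawler sees.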
Uptime: being there when it counts
An AI crawler visiting your site during downtime registers a timeout. Depending on the crawler, that can lead to a reduced crawl budget or a longer wait until the next crawl. Aim for 99.9% uptime and monitor it actively. UptimeRobot or Better Uptime are solid tools for that.
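Alongside a monitoring service, a simple cron job can serve as a safety net. A minimal sketch: the URL, script path and alert address are placeholders, and it assumes a working mail command on the server:

# check.sh: alert when the site does not answer with HTTP 200
STATUS=$(curl -o /dev/null -s -w "%{http_code}" --max-time 10 https://yourdomain.com)
if [ "$STATUS" != "200" ]; then
  echo "Site down: HTTP $STATUS" | mail -s "Uptime alert" you@example.com
fi

# Run it every 5 minutes via crontab:
# */5 * * * * /path/to/check.sh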
What I recommend after dozens of migrations
This is not an affiliate list. This is what we actually use and recommend at Kobalt, based on dozens of hosting migrations.
- Small sites (up to 10,000 visitors/month): good managed hosting or a shared VPS. Do check your TTFB after you start.
- Mid-size (10,000-100,000 visitors): dedicated VPS or cloud (Hetzner, DigitalOcean, AWS Lightsail) with a CDN in front.
- Large (100,000+): auto-scaling cloud with load balancing and robust CDN configuration.
- All levels: set a Crawl-delay for AI bots in your robots.txt (example after this list). It protects your server without blocking crawlers.
- All levels: monitor your TTFB and uptime weekly. Not only when there are complaints.
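That Crawl-delay line looks like this in practice. A minimal sketch: GPTBot and ClaudeBot are two common AI user agents (check your own access logs for what actually visits you), and Crawl-delay is a de facto convention that not every crawler honors:

# robots.txt: slow AI bots down without blocking them
User-agent: GPTBot
Crawl-delay: 2

User-agent: ClaudeBot
Crawl-delay: 2

A delay of 2 seconds per request is a reasonable starting point; tune it to what your server comfortably handles.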
Shared hosting for less than 5 euros per month is almost never suitable for serious AI visibility. The resource limits, the noisy neighbors on the same server and the minimal support make it a constant source of frustration. Invest in your hosting. It is the foundation of everything.
I sometimes compare it to a birdhouse. You can build the most beautiful house, but if you place it where the wind has free rein and the cat can reach it, no birds will come. Location and foundation determine everything.
Frequently asked questions
Does the choice of hosting affect my Google ranking?
Yes, indirectly. Google uses Core Web Vitals as a ranking factor, and TTFB feeds directly into those metrics. Poor hosting means a high TTFB, which drags down your Largest Contentful Paint (LCP), which harms your ranking. Frequent downtime leads to crawl problems with Googlebot. Good hosting is both an SEO and an AEO investment.
Do I need to set up a separate server for AI crawlers?
No. What you can do is configure your server for multiple concurrent requests and set a Crawl-delay in your robots.txt to regulate the pace (see the example earlier in this piece). That protects your server and ensures crawlers eventually reach all your content.
Is a CDN a sufficient replacement for good hosting?
A CDN improves delivery of static content and reduces latency for international visitors. But it cannot fully compensate for a slow origin server. Dynamic pages still go through to your server. The hosting itself needs to be in order.
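If you use Cloudflare, you can check per URL whether a response came from the edge or went through to your origin. A minimal sketch, relying on Cloudflare's CF-Cache-Status response header:

# Did Cloudflare serve this URL from cache?
curl -sI https://yourdomain.com/ | grep -i "cf-cache-status"

# HIT     = served from the edge, origin untouched
# MISS    = fetched from origin this time, cached for next time
# DYNAMIC = never cached; every request still hits your origin server

Pages that consistently show DYNAMIC are exactly the ones where your origin's TTFB still matters.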
Your hosting is not a line item on your invoice. It is the foundation of your online presence. Cutting corners on hosting means cutting corners on your reachability, for everyone who wants to find you. Human or machine.
How does your website score on AI readiness?
Get your AEO score within 30 seconds and discover what you can improve.