llms.txt: the robots.txt for AI models
What is llms.txt?
The llms.txt file is a relatively new standard that is gaining traction in the web development community. Just as robots.txt provides instructions to web crawlers, llms.txt gives large language models (LLMs) specific guidance about the content and structure of your website. The file is placed in the root of your domain and gives AI models a structured overview of what your site has to offer.
The standard was proposed by Jeremy Howard (founder of fast.ai) and is now supported by multiple AI platforms. The idea is simple yet powerful: give AI models a concise, machine-readable summary of your website so they can better index and use your content as a source.
Not yet familiar with the broader concept of AI optimization? Start with our introduction to AEO for the full context.
The llms.txt format
An llms.txt file follows a specific markdown-like format. It begins with your organization name as an H1 heading, followed by a short blockquote description, and then sections that categorize your most important content.
# Kobalt Digital
> Kobalt Digital is an AEO and SEO consultancy based in Amsterdam.
> We help businesses become visible in AI-generated answers.
## Docs
- [AEO Strategy Guide](/docs/aeo-strategy): Complete guide to Answer Engine Optimization
- [Schema.org Implementation](/docs/schema-org): Guide for structured data markup
- [AI-Ready Checklist](/docs/checklist): Step-by-step checklist for AI readiness
## Blog
- [What is AEO?](/blog/what-is-aeo): Introduction to Answer Engine Optimization
- [llms.txt Explained](/blog/llms-txt): Everything about the llms.txt file
## API
- [Scanner API](/api/docs): Documentation for the AEO Scanner API
## Optional
- [About Us](/about): Information about the team
- [Contact](/contact): Get in touch
Sections in detail
The file has several sections, each serving a specific purpose. The structure is designed to help AI models quickly find the right content.
- The H1 heading and blockquote provide the identity and core description of your organization.
- The "Docs" section contains your most important documentation and guides.
- The "Blog" section points to your most relevant blog posts.
- The "API" section is relevant if you offer a public API.
- The "Optional" section contains links that are useful but not essential.
Link syntax
Each link follows the format: `- [Title](URL): Description`. The description after the colon is crucial: it helps AI models understand what the page is about without having to load it first. Keep descriptions short but informative, ideally between 10 and 30 words.
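As a sketch, the link-line format can be checked mechanically; the llms.txt proposal defines no official validator, so the function name, the regular expression, and the word limit below are illustrative:

```python
import re

# Matches "- [Title](URL): Description" — the link line format described above.
LINK_LINE = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)\s]+)\):\s*(?P<desc>.+)$")

def check_link_line(line: str) -> bool:
    """Return True if a line follows '- [Title](URL): Description'
    with a description of at most ~30 words (an illustrative limit)."""
    match = LINK_LINE.match(line.strip())
    if not match:
        return False
    words = len(match.group("desc").split())
    return 1 <= words <= 30
```

For example, `check_link_line("- [What is AEO?](/blog/what-is-aeo): Introduction to Answer Engine Optimization")` returns `True`, while a bare bullet without a link or description fails.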
Dive deeper: Robots.txt for AI: more than crawl instructions | Schema.org markup for AI | MCP Servers for AI agents
Placing and validating llms.txt
The file must be accessible at `https://yourdomain.com/llms.txt`. Make sure the file is served with the correct content-type header.
- Create the file in the root of your web server (public directory in Laravel).
- Ensure the content-type is `text/plain; charset=utf-8`.
- Verify the file is publicly accessible without authentication.
- Validate the syntax: each section starts with ## and links follow the markdown format.
- Test the URL in your browser: go to yourdomain.com/llms.txt and check if the content displays correctly.
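The syntax checks above can be sketched as a small script. This is an illustrative validator, not an official tool; it only enforces the basic structure described earlier (H1 first, blockquote description, ## sections, markdown links):

```python
def validate_llms_txt(text: str) -> list[str]:
    """Check the basic llms.txt structure and return a list of
    problems; an empty list means the file looks valid."""
    problems = []
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("file must start with an H1 heading ('# Organization')")
    if len(lines) < 2 or not lines[1].startswith("> "):
        problems.append("the H1 should be followed by a '>' blockquote description")
    if not any(l.startswith("## ") for l in lines):
        problems.append("no '## Section' headings found")
    for l in lines:
        if l.startswith("- ") and "](" not in l:
            problems.append(f"list item is not a markdown link: {l!r}")
    return problems
```

Running it over the example file from earlier should return an empty list; running it over a page of plain prose reports each missing structural element.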
Example: adding llms.txt in Laravel
In a Laravel project, you place the llms.txt file in the `public/` directory. Alternatively, you can create a route that dynamically generates the file based on your content.
// routes/web.php
Route::get('llms.txt', function () {
    $content = "# Your Company\n\n";
    $content .= "> Description of your organization.\n\n";
    $content .= "## Blog\n\n";
    $posts = App\Models\Post::published()->get();
    foreach ($posts as $post) {
        $content .= "- [{$post->title}]({$post->url}): {$post->excerpt}\n";
    }
    return response($content)
        ->header('Content-Type', 'text/plain; charset=utf-8');
});
Add a reference to your llms.txt in your robots.txt with an Llms-Txt rule: Llms-Txt: https://yourdomain.com/llms.txt. This helps AI crawlers discover the file faster.
llms-full.txt for extended content
In addition to the standard llms.txt file, you can also create an llms-full.txt. This file contains more detailed information and can include longer descriptions, full texts or extensive documentation. Where llms.txt is a concise overview, llms-full.txt is the complete encyclopedia of your website.
This is especially useful for organizations with extensive documentation, technical manuals or knowledge bases. AI models with large context windows can use this extended version for a deeper understanding of your content.
When to choose llms.txt versus llms-full.txt?
Use the standard llms.txt as a compact overview of your 20 to 50 most important pages. This is what AI models load first to get a picture of your website. Use llms-full.txt for the full catalog with detailed descriptions. AI models with a large context window (100K+ tokens) can load this version for in-depth analysis. A good rule of thumb: if your llms.txt exceeds 4,000 words, split it into a concise version and a full version.
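The 4,000-word rule of thumb is easy to check mechanically; a minimal sketch, with the threshold as a parameter since it is a guideline rather than a hard limit:

```python
def needs_full_version(llms_txt: str, limit: int = 4000) -> bool:
    """Rule of thumb from above: if llms.txt exceeds ~4,000 words,
    split it into a concise llms.txt and a detailed llms-full.txt."""
    return len(llms_txt.split()) > limit
```

This can run in a CI step or a scheduled job so the split happens before AI models start truncating the file themselves.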
The relationship between llms.txt and robots.txt
It is important to understand that llms.txt and robots.txt are complementary files. Robots.txt tells AI bots which pages they may and may not visit. Llms.txt tells them which pages are most relevant and provides context about your website as a whole. Together they form a complete instruction set for AI crawlers.
Think of robots.txt as the security guard at the entrance who decides who gets in, and llms.txt as the receptionist who tells visitors where to find the information they are looking for.
Best practices and common mistakes
- Keep your llms.txt file up to date. Remove links to pages that no longer exist and add new important content.
- Use descriptive titles and descriptions in both Dutch and English if you have a multilingual site.
- Limit the standard llms.txt to your 20 to 50 most important pages. Use llms-full.txt for a complete overview.
- Avoid including pages behind authentication: AI models cannot access those anyway.
- Regularly test whether all links in the file still work. Broken links reduce the trust AI models have in your content.
- Also include your Schema.org documentation and API docs if you have them: this increases your technical credibility.
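To support the "test whether all links still work" practice, the link targets first need to be extracted; a hedged sketch that leaves the actual HTTP check to a separate job (the function name and `base` parameter are illustrative):

```python
import re

def extract_urls(llms_txt: str, base: str = "") -> list[str]:
    """Collect every link target from '- [Title](URL): ...' lines so a
    separate job can HTTP-check them (the check itself is omitted here)."""
    urls = re.findall(r"^- \[[^\]]+\]\(([^)\s]+)\)", llms_txt, flags=re.MULTILINE)
    # Prefix relative paths with the site origin, e.g. base="https://example.com".
    return [base + u if u.startswith("/") else u for u in urls]
```

Feeding each returned URL to a HEAD request and flagging non-200 responses is then a few lines in any HTTP client.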
Who supports llms.txt already?
A growing number of AI platforms recognize llms.txt as a source. Perplexity reportedly loads llms.txt files when indexing websites, Anthropic's Claude crawlers are aware of the format, and OpenAI's crawlers are familiar with it as well. As more platforms adopt the standard, a well-maintained llms.txt becomes increasingly valuable.
Key takeaways
- llms.txt is a standardized file that tells AI models what your website has to offer, similar to how robots.txt instructs web crawlers.
- The format follows a markdown-like syntax with sections for documentation, blog, API and optional links.
- Place the file at yourdomain.com/llms.txt with content-type text/plain and reference it from your robots.txt.
- Use llms.txt for a concise overview (20 to 50 pages) and llms-full.txt for a complete catalog.
- Keep the file current, test links regularly and combine it with a well-configured robots.txt.
Frequently asked questions
Is llms.txt required for AEO?
No, llms.txt is not required, but it is a strongly recommended best practice. Without llms.txt, AI models have to explore your website themselves to understand what you have to offer. With llms.txt, you give them a structured roadmap that significantly simplifies their work.
How often should I update my llms.txt?
Update your llms.txt every time you add, change or remove important content. In practice, this means at least monthly for most websites. If you have a dynamic site with frequent new content, consider automatically generating the file from your CMS.
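Generating the file from a CMS can be sketched as follows; this is a minimal Python sketch mirroring the Laravel example earlier, and the `title`, `url`, and `excerpt` field names are assumptions about your content model:

```python
def build_llms_txt(name: str, description: str, posts: list[dict]) -> str:
    """Assemble a minimal llms.txt from CMS data; 'title', 'url'
    and 'excerpt' are assumed field names."""
    lines = [f"# {name}", f"> {description}", "", "## Blog"]
    for post in posts:
        lines.append(f"- [{post['title']}]({post['url']}): {post['excerpt']}")
    return "\n".join(lines) + "\n"
```

Hooking this into a publish event keeps the file current without manual edits.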
Can I use llms.txt on a WordPress site?
Yes, you can use llms.txt on any website. For WordPress, you can manually upload the file to the root of your installation, or use a plugin that automatically generates it based on your content. Several WordPress plugins that support llms.txt are now available.
What is the difference between llms.txt and a sitemap.xml?
A sitemap.xml is intended for search engine crawlers and contains all indexable URLs on your site. llms.txt is intended for AI models and contains a curated overview of your most important pages with descriptions. Where a sitemap aims to be complete, llms.txt aims to be selective and informative. They complement each other.
Does not having llms.txt hurt my SEO?
The absence of llms.txt has no direct negative impact on your traditional SEO rankings. It is specifically a tool for AI visibility. However, without llms.txt you risk that AI models understand your website less well and cite it less often as a source.
A well-maintained llms.txt file is like a business card for AI models: it tells them at a glance who you are and what you have to offer.
How does your website score on AI readiness?
Get your AEO score within 30 seconds and discover what you can improve.