
llms.txt

A standard file that informs LLMs about the content and structure of a website.

Bas Vermeer, SEO/AEO Specialist

llms.txt is a proposed standard file (similar to robots.txt) placed in a website's root to inform large language models about available content. It provides a structured overview of the website that AI models can use to understand the site.

What does llms.txt contain?

An llms.txt file typically contains: a description of the website or organization, the main pages and sections, contact information, and instructions for how AI models may use the content. The format is still evolving but is being adopted more widely.

llms.txt and AEO

Offering an llms.txt is a proactive signal that your website is AI-ready. It gives LLMs direct context without having to crawl your entire site. Together with robots.txt and Schema.org, it forms a complete communication layer with AI systems.

Example llms.txt file

# Company Name
> Short description of the organization and what the website offers.

## Documentation
- [API Documentation](https://example.com/docs/api): Complete REST API reference
- [Guides](https://example.com/docs/guides): Step-by-step implementation guides
- [FAQ](https://example.com/faq): Frequently asked questions and answers

## Products and Services
- [Product A](https://example.com/products/a): Description of product A
- [Product B](https://example.com/products/b): Description of product B
- [Pricing](https://example.com/pricing): Overview of pricing plans

## Company Information
- [About Us](https://example.com/about): Mission, team, and history
- [Contact](https://example.com/contact): Contact details and opening hours
- [Blog](https://example.com/blog): Latest articles and insights

## Optional
- [Technical Specs](https://example.com/docs/specs): Detailed technical documentation
- [Case Studies](https://example.com/cases): Practical examples and results

Comparison: llms.txt vs robots.txt vs sitemap.xml

| Feature | llms.txt | robots.txt | sitemap.xml |
| --- | --- | --- | --- |
| Purpose | Give AI models context about your site | Tell crawlers what they may/may not access | Show search engines all indexable URLs |
| Audience | LLMs and AI agents | All web robots and crawlers | Search engine crawlers |
| Format | Markdown with links and descriptions | Plain text with User-agent/Disallow rules | XML with URL elements |
| Location | /llms.txt (root) | /robots.txt (root) | /sitemap.xml (root, or via robots.txt) |
| Content | Descriptions, links to key pages, context | Access rules per crawler/path | List of URLs with metadata (lastmod, priority) |
| Binding? | No, informational | Convention, not legally binding | No, suggestion to crawlers |
| Status | Proposed standard, growing adoption | Industry standard since 1994 | Industry standard (sitemaps.org protocol) |
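For comparison, the two established files in the table are terse by design. A minimal robots.txt that allows all crawlers and points to the sitemap might look like this (the URL is a placeholder):

```txt
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Note that llms.txt complements rather than replaces these files: robots.txt governs access, sitemap.xml lists URLs, and llms.txt adds the descriptive context the other two lack.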

What does our scanner check?

The scanner checks whether your website offers an llms.txt file at the correct location (/llms.txt), whether the file is valid and follows the expected structure, and whether it contains relevant content. This is part of the AEO score.
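The scanner's exact checks aren't published, but the structural rules described above (an H1 title, a `>` summary line, and H2 sections with Markdown link entries) can be sketched as a simple validator:

```python
import re


def validate_llms_txt(text: str) -> list[str]:
    """Rough structural checks for an llms.txt file, based on the
    proposed format's conventions. Returns a list of issues; an
    empty list means the basic structure looks valid."""
    issues = []
    lines = text.splitlines()

    # H1 title at the top, e.g. "# Company Name"
    if not any(re.match(r"^# \S", ln) for ln in lines):
        issues.append("missing H1 title (e.g. '# Company Name')")

    # Blockquote summary, e.g. "> Short description..."
    if not any(ln.startswith("> ") for ln in lines):
        issues.append("missing '>' summary line after the title")

    # At least one H2 section, e.g. "## Documentation"
    if not any(re.match(r"^## \S", ln) for ln in lines):
        issues.append("no H2 sections ('## ...') found")

    # At least one Markdown link entry, e.g. "- [Title](url): description"
    link = re.compile(r"^- \[[^\]]+\]\(https?://[^)]+\)")
    if not any(link.match(ln) for ln in lines):
        issues.append("no link entries ('- [Title](url): ...') found")

    return issues


sample = """# Example Co
> Short description of the organization.

## Documentation
- [API](https://example.com/docs/api): API reference
"""
```

Running `validate_llms_txt(sample)` on the well-formed example returns an empty list; feeding it arbitrary text surfaces each missing element by name.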

Frequently asked questions

Is llms.txt already an official standard?

No, llms.txt is a proposed standard introduced in 2024. The format is still under active development, and adoption by organizations is increasing. There is no formal RFC or W3C standard yet, but consensus on the format is growing.

What is the difference between llms.txt and llms-full.txt?

The llms.txt file contains a concise overview with links to key pages. The optional llms-full.txt file contains the full content of those pages inline, so an LLM can read everything at once without making additional requests. This is useful for smaller sites where all relevant content fits in a single file.
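One way to produce an llms-full.txt is to expand each link entry in llms.txt with that page's content. A minimal sketch (the fetch step is replaced by a prefilled dict; in practice you would retrieve each URL):

```python
import re


def build_llms_full(llms_txt: str, pages: dict[str, str]) -> str:
    """Expand each '- [Title](url): ...' entry in an llms.txt with the
    linked page's content, producing llms-full.txt-style output.

    `pages` maps URL -> page text; a real implementation would fetch
    each URL instead of using a prefilled dict."""
    out = []
    entry = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)")
    for line in llms_txt.splitlines():
        out.append(line)
        m = entry.match(line)
        if m and m.group("url") in pages:
            # Inline the page body directly under its link entry.
            out.append("")
            out.append(f"### {m.group('title')}")
            out.append(pages[m.group("url")].strip())
            out.append("")
    return "\n".join(out)
```

For example, expanding a single About entry with its page text yields the original link line followed by a `### About` heading and the inlined content, so an LLM can read everything in one pass.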

Do I need to describe my entire website in llms.txt?

No. Focus on the pages most relevant for AI interaction: your core products/services, documentation, frequently asked questions, and contact information. Think of it as a curated guide for AI models, not a complete sitemap.

How do I test if my llms.txt works correctly?

Visit https://yourdomain.com/llms.txt in your browser to check if the file is accessible. Verify the Markdown format is correct, that links work, and that descriptions are meaningful. Use our scanner to automatically check if your llms.txt meets the expected structure.

Do AI models respect the instructions in llms.txt?

llms.txt is primarily informational, not restrictive. It's meant to give AI models context, not to restrict access (that's what robots.txt is for). AI models use the information in llms.txt to better understand your site and generate more relevant answers.

RELATED SCANNER CHECKS

llms.txt present
llms.txt valid
