llms.txt - The File That Separates 63 from 34
Tidio has a 251-line llms.txt. Crisp has zero. The score gap: 29 points. This single file tells AI assistants exactly what your site does, and without it, they're guessing.
Part of the AEO scoring framework - the current 48 criteria that measure how ready a website is for AI-driven search across ChatGPT, Claude, Perplexity, and Google AIO.
Quick Answer
llms.txt is a plain-text file at your domain root that hands AI assistants a cheat sheet about your site: what you do, what content you offer, where to find it. We've seen sites jump 10+ points just by adding one. Place it at /llms.txt with your business description, key content areas, and URLs. Takes 20 minutes.
Audit Note
In our audits, we've measured llms.txt adoption on live sites, compared implementations, and documented the gaps that keep scores low.
Sites with llms.txt consistently score 20-30 points higher.
Before & After
Before - No llms.txt file
```
# No file at /llms.txt
# AI has to guess what your site does
# from scattered HTML pages and marketing copy
```
After - Structured llms.txt at domain root
```
# Example Business Name

> Brief description of what the business does.

## Products & Services
- Live Chat: Real-time support widget (/live-chat)
- Help Desk: Ticket management system (/help-desk)

## Key Content
- Blog: Expert articles (/blog)
- FAQ: Common questions answered (/faq)
```
What Is llms.txt and How Does It Work?
llms.txt is a plain-text file sitting at your domain root (e.g., example.com/llms.txt) that gives large language models a structured summary of your website.
Think of it as robots.txt's smarter cousin. robots.txt tells crawlers which pages to access. llms.txt tells AI systems what your site actually is, what it offers, and how to navigate it.
Tidio's llms.txt runs 251 lines. It lists every product, every feature, every integration. When ChatGPT needs to answer "What live chat tools support Shopify?", Tidio's file hands over the answer on a silver platter. Crisp? No llms.txt at all. Score: 34.
The format is dead simple:
- A brief business description
- Key content areas with URLs
- Product/service categories
- Contact info and location
- Links to important resources (FAQ, docs, API references)
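Because the format is this regular, you can generate it from data you already keep. A minimal sketch; the `business` dict layout and `build_llms_txt` helper are our own illustration, not part of any spec:

```python
# Build a minimal llms.txt from structured business data.
# The dict layout here is illustrative, not a formal schema.
business = {
    "name": "Example Business Name",
    "summary": "Brief description of what the business does.",
    "services": {
        "Live Chat": ("Real-time support widget", "/live-chat"),
        "Help Desk": ("Ticket management system", "/help-desk"),
    },
    "content": {
        "Blog": ("Expert articles", "/blog"),
        "FAQ": ("Common questions answered", "/faq"),
    },
}

def build_llms_txt(biz: dict) -> str:
    lines = [f"# {biz['name']}", "", f"> {biz['summary']}", ""]
    for heading, items in [("Products & Services", biz["services"]),
                           ("Key Content", biz["content"])]:
        lines.append(f"## {heading}")
        for name, (desc, url) in items.items():
            lines.append(f"- {name}: {desc} ({url})")
        lines.append("")
    return "\n".join(lines).rstrip() + "\n"

print(build_llms_txt(business))
```

Wire this into your build so the file regenerates whenever your product catalog changes.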
Why Do AI Assistants Struggle Without llms.txt?
Here's what happens without llms.txt. Someone asks ChatGPT about your industry. The AI has to piece together your site's purpose from scattered HTML pages, marketing fluff, and cookie banners. It gets confused. It skips you.
Put on ChatGPT's glasses for a second. It's scanning hundreds of sites for an answer. One site has a clear, structured summary of everything it does. Another site forces the AI to parse fifty pages of JavaScript-heavy marketing copy. Which one gets cited?
We've audited sites across multiple verticals. The pattern is unmistakable:
- Sites with llms.txt get their business described accurately by AI, because they wrote the description themselves
- Sites without it get described wrong, or worse, not described at all
- The file works from day one, unlike SEO authority that takes months to build
- You control the narrative. You define how AI systems talk about your business.
How Do You Build an llms.txt File?
Create a file called llms.txt in your site's root directory. Here's the skeleton:
```
# Example Business Name

> Brief one-line description of what the business does.

## About
Two to three sentences about the business, its expertise, and what makes it unique.

## Products & Services
- Category 1: Description (URL: /category-1)
- Category 2: Description (URL: /category-2)

## Key Content
- Blog: Expert articles about [topic] (/blog)
- FAQ: Common questions answered (/faq)
- Guides: In-depth resources (/guides)

## Contact
- Location: City, State
- Email: hello@example.com
- Website: example.com
```
Want to go deeper? Create llms-full.txt with your complete content inventory, taxonomy, and technical specs. Our own llms-full.txt runs the full content map -every article, every audit, every tool.
For Shopify: upload via the theme editor (Settings > Files) or host on your CDN. For WordPress: drop it in the root directory or use a plugin. For Next.js or static sites: add it to your public folder.
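For a Next.js or static-site build, placement can be a tiny pre-build script. A sketch assuming your framework serves a `public/` folder at the domain root (that folder name matches Next.js and many static generators; adjust for yours):

```python
from pathlib import Path

# Write llms.txt into the folder your framework serves at the domain root.
# "public" is an assumption matching Next.js; change it for your setup.
public_dir = Path("public")
public_dir.mkdir(exist_ok=True)

llms_txt = """# Example Business Name

> Brief one-line description of what the business does.

## Key Content
- Blog: Expert articles (/blog)
- FAQ: Common questions answered (/faq)
"""

(public_dir / "llms.txt").write_text(llms_txt, encoding="utf-8")
print((public_dir / "llms.txt").read_text(encoding="utf-8").splitlines()[0])
```

After deploying, confirm the file actually resolves at https://example.com/llms.txt, not a subdirectory.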
Start here: Open a text editor right now and write 10 lines describing your business. That's your llms.txt draft.
What Are the Most Common llms.txt Mistakes?
We see the same mistakes in audit after audit:
Writing a novel instead of a summary. AI systems want concise, structured facts, not your brand manifesto. Tidio's 251 lines work because every line is specific product information, not fluff.
Never updating it. Your llms.txt from six months ago doesn't mention your three new features. AI is now giving outdated info about your business.
Writing marketing copy. "We're the world's leading provider of synergistic solutions..." AI systems want facts. What do you do. Where are you. What do you sell. Period.
Forgetting URLs. The file should help AI navigate to specific pages. A description without links is a map without roads.
Putting it in a subdirectory. It must live at /llms.txt. Not /assets/llms.txt. Not /docs/llms.txt. The root.
Score Impact in Practice
The llms.txt criterion carries 3% weight in the Technical Plumbing tier of the AEO scoring model. That might sound small, but the downstream effects are outsized because llms.txt shapes how AI understands every other page on your site.
Sites with a well-structured llms.txt consistently score 7-8/10 on this criterion. Sites without one score 0 - there's no partial credit. You either have the file or you don't. In the live chat vertical, Tidio's 251-line llms.txt contributes to its 63/100 overall score. Crisp, with no llms.txt at all, sits at 34/100. That 29-point gap isn't caused by llms.txt alone, but when we break down the scoring, the file's presence correlates with higher scores across multiple criteria because it helps AI contextualize everything else on the site.
Across the 500+ sites we've audited, fewer than 15% have an llms.txt file. The adoption rate among Y Combinator startups is higher - roughly 25% - but still leaves massive ground unclaimed. Early adopters get disproportionate benefit because AI systems are actively looking for this file and rewarding the sites that provide it.
How AI Engines Evaluate This
Each AI engine processes llms.txt slightly differently, but the core behavior is the same: they check your domain root for the file and use it as a primary context source.
ChatGPT (GPTBot) fetches /llms.txt early in its crawl cycle. When it finds a structured summary with clear product descriptions and URLs, it uses those descriptions almost verbatim when answering questions about your business. Without the file, GPTBot has to infer your business purpose from scattered page content - and it frequently gets it wrong or skips you entirely.
Claude (Anthropic's crawler) treats llms.txt as a trust signal. A site that takes the time to create a machine-readable summary demonstrates intentionality about AI engagement. Claude's retrieval system uses the file to build an entity model - what the business does, what it offers, where its key content lives. This entity model feeds directly into citation decisions.
Perplexity is the most aggressive consumer of llms.txt. When Perplexity builds its source list for an answer, it checks llms.txt to quickly determine whether a site is relevant to the query. A site with a clear, structured description gets evaluated faster and more accurately. Sites without the file require Perplexity to scan multiple pages to determine relevance - and with its speed-optimized architecture, it often moves on before completing that scan.
Google's AI Overviews don't directly consume llms.txt the way purpose-built AI crawlers do, but the file's content reinforces the same signals Google already parses from structured data and meta descriptions.
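Whatever each engine does internally (the behaviors above are our read of observed patterns, not documented crawler internals), the file reduces to the same simple structure. A sketch of parsing llms.txt into a section map, the kind of entity model a retrieval system could build; `parse_llms_txt` is a hypothetical helper of ours:

```python
def parse_llms_txt(text: str) -> dict:
    """Split an llms.txt file into name, summary, and per-section bullets."""
    model = {"name": None, "summary": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("## "):          # section heading
            current = line[3:]
            model["sections"][current] = []
        elif line.startswith("# ") and model["name"] is None:
            model["name"] = line[2:]        # top-level business name
        elif line.startswith("> ") and model["summary"] is None:
            model["summary"] = line[2:]     # one-line description
        elif line.startswith("- ") and current:
            model["sections"][current].append(line[2:])
    return model

sample = (
    "# Example Business Name\n"
    "> Brief description.\n"
    "## Key Content\n"
    "- Blog: Expert articles (/blog)\n"
    "- FAQ: Common questions answered (/faq)\n"
)
model = parse_llms_txt(sample)
print(model["name"], len(model["sections"]["Key Content"]))
```

The takeaway: the cleaner your headings and bullets, the less any engine has to guess.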
Key Takeaways
- Create a plain-text llms.txt file at your domain root with your business description, key content areas, and URLs.
- Keep it factual and structured - AI wants specific product info, not marketing fluff.
- Update your llms.txt whenever you launch new features or content areas so AI never gives outdated info.
- Include direct URLs to important pages - a description without links is a map without roads.
How does your site score on this criterion?
Get a free AEO audit and see where you stand across all 48 criteria.