Reduce hallucinations in Cursor, Claude, and other LLMs
Generate an llms.txt with accompanying markdown content from multiple sources, to help agents access everything easily.
Check if your llms.txt file follows the official specification from llmstxt.org
| RANK | SOURCE | TOKENS | VOLUME | USERS | INSTALL |
|---|---|---|---|---|---|
llms.txt is a standardized file format (similar to robots.txt) that helps AI agents and LLMs efficiently navigate and understand your website's content. It provides a structured table of contents with links to markdown or plain text versions of your pages.
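For reference, a minimal llms.txt following the llmstxt.org format looks like this (the site name, URLs, and descriptions are placeholder examples):

```markdown
# Example Docs

> Developer documentation for the Example API, available as markdown for AI agents.

## Guides

- [Quickstart](https://example.com/docs/quickstart.md): Install the SDK and make your first request
- [Authentication](https://example.com/docs/auth.md): API keys and OAuth flows

## Reference

- [REST API](https://example.com/docs/api.md): Endpoints, parameters, and response formats
```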
As AI traffic grows exponentially, traditional web scraping becomes inefficient and costly. llms.txt allows you to serve AI-friendly content directly, reducing latency and bandwidth while improving the quality of information AI agents can retrieve from your site.
The llms.txt MCP (Model Context Protocol) server turns any valid llms.txt file into a structured resource that AI assistants like Claude can directly access. This means developers can reference documentation, APIs, or content from any llms.txt-enabled website without hallucination or manual copying.
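To make the idea concrete, here is a minimal TypeScript sketch of what such a server does under the hood: fetch a site's llms.txt and expose its table of contents as structured entries. This is an illustrative simplification, not the actual server's code; `loadLlmsTxt` and the example URL are hypothetical.

```typescript
// Illustrative sketch: fetch an llms.txt file and extract its
// table-of-contents entries. Not the actual MCP server implementation.
interface TocEntry {
  title: string;
  url: string;
  description?: string;
}

async function loadLlmsTxt(baseUrl: string): Promise<TocEntry[]> {
  const res = await fetch(new URL("/llms.txt", baseUrl));
  if (!res.ok) throw new Error(`No llms.txt found at ${baseUrl}`);
  const text = await res.text();

  // Each entry is a markdown list item: "- [Title](url): optional description"
  const entryPattern = /^-\s*\[([^\]]+)\]\(([^)]+)\)(?::\s*(.+))?$/gm;
  const entries: TocEntry[] = [];
  for (const match of text.matchAll(entryPattern)) {
    entries.push({ title: match[1], url: match[2], description: match[3] });
  }
  return entries;
}

// An AI assistant can then fetch any entry's markdown on demand:
// const toc = await loadLlmsTxt("https://example.com");
```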
While Context7 uses vector search to find relevant content, llms.txt MCP provides structured navigation through a table of contents. The agent follows explicit links rather than relying on similarity scores, which allows more precise retrieval and reasoning about how content is organized; for well-documented resources, this is often more efficient than embedding-based search.
The CREATE tool is perfect for website owners whose CMS doesn't support markdown output. It generates an llms.txt file and converts all your pages to markdown automatically using the Parallel Extract API, saving you from manual conversion work.
Your llms.txt must follow the official specification. Key requirements include:

- An H1 title with your site or project name (the only required element)
- An optional blockquote summary directly below the title
- H2 section headings, each containing a markdown list of links in [title](url) format with optional descriptions
- The file served at yoursite.com/llms.txt

Common validation failures include a missing or misplaced H1 title, links that aren't formatted as markdown list items, and linked URLs that return HTML instead of plain text or markdown. Use our CHECK tool to identify specific issues.
Your llms.txt should link to markdown or plain text versions of pages, not HTML. Options include:

- Serving a .md twin of each page alongside the HTML (see the sketch below)
- Exporting markdown directly from your CMS, if it supports it
- Using our CREATE tool, which scrapes your existing pages and converts them to markdown automatically
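As one example of the first option, here is a hedged sketch of an Express route that serves a markdown twin of each docs page with the right Content-Type. The docs/ directory layout and route shape are assumptions for illustration:

```typescript
// Sketch: serve markdown twins of HTML pages, assuming markdown
// sources live in a local docs/ directory. Route shape is illustrative.
import express from "express";
import { readFile } from "node:fs/promises";
import path from "node:path";

const app = express();

app.get("/docs/:page", async (req, res, next) => {
  // Only handle requests for the markdown variant, e.g. /docs/quickstart.md
  if (!req.params.page.endsWith(".md")) return next();
  // basename() guards against path traversal outside docs/
  const file = path.join("docs", path.basename(req.params.page));
  try {
    res.type("text/markdown").send(await readFile(file, "utf8"));
  } catch {
    res.status(404).send("Not found");
  }
});

app.listen(3000);
```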
Keep your llms.txt file under 10k tokens. The file itself should be compact—a table of contents, not the full content. Each page it links to should also stay under 10k tokens. This ensures efficient context usage and faster processing by AI agents.
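A quick way to sanity-check the budget is a rough estimate; a common heuristic is roughly 4 characters per token for English text. For exact counts, use your model's tokenizer. The function below is a hypothetical helper:

```typescript
// Rough token estimate using the ~4 characters per token heuristic.
// For exact counts, use a real tokenizer for your target model.
async function estimateTokens(url: string): Promise<number> {
  const text = await (await fetch(url)).text();
  return Math.ceil(text.length / 4);
}

// const tokens = await estimateTokens("https://example.com/llms.txt");
// if (tokens > 10_000) console.warn(`~${tokens} tokens; consider trimming`);
```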
Yes, the URLs in your llms.txt must return content with Content-Type text/plain or text/markdown. If your site only serves HTML, use our CREATE tool which automatically scrapes and converts your pages to markdown using the Parallel Extract API.
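You can verify this yourself with a few lines of script. Here is a sketch that checks a URL's Content-Type header (the example URL is a placeholder):

```typescript
// Sketch: verify that a URL serves plain text or markdown,
// as required for pages linked from an llms.txt file.
async function hasTextContentType(url: string): Promise<boolean> {
  const res = await fetch(url, { method: "HEAD" });
  const type = res.headers.get("content-type") ?? "";
  return type.startsWith("text/plain") || type.startsWith("text/markdown");
}

// console.log(await hasTextContentType("https://example.com/docs/quickstart.md"));
```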
Click any "⬇ INSTALL" button on our site to visit installthismcp.com, which provides step-by-step instructions for installing MCP servers into Claude Desktop or other compatible AI assistants.
A high-quality llms.txt has:

- A clear H1 title and a concise blockquote summary
- Descriptive link titles with a short description of each page
- Logical sections that mirror how your content is organized
- Links to focused markdown pages, each under 10k tokens
Update your llms.txt whenever you add, remove, or significantly reorganize content. For documentation sites, this might be with each release. For blogs, consider updating monthly or when adding major content sections.
MCP servers fetch content dynamically, so changes to your llms.txt are reflected immediately. There's no need to "reinstall" or update the MCP—it always reads the current version of your llms.txt file.
The standard llms.txt lives at your domain root, but you can create llms.txt files for subdomains (e.g., docs.yoursite.com/llms.txt). Each subdomain's llms.txt can have its own MCP server.
Vector search (RAG) finds semantically similar content but loses document structure. llms.txt provides a navigable table of contents, allowing AI to reason about organization and relationships between topics. Both approaches are valuable—llms.txt excels for well-structured documentation, while RAG works better for unstructured knowledge bases.
Use both! Sitemaps help search engines discover URLs. llms.txt provides AI-optimized content structure with markdown/text links. They serve complementary purposes—sitemaps for discovery, llms.txt for efficient AI consumption.
Any AI assistant with MCP support (like Claude Desktop) can use llms.txt via our MCP servers. Additionally, AI agents and browsers that follow the llmstxt.org standard can directly read these files. The ecosystem is growing rapidly.
Yes! If your CMS supports markdown export or custom content types, you can generate llms.txt directly. If not, use our CREATE tool to automatically generate both the llms.txt and markdown versions from your existing HTML content.
As AI traffic grows, companies with llms.txt-enabled sites will:

- Serve AI agents with lower latency and bandwidth costs
- Reduce inefficient scraping load on their infrastructure
- Be more discoverable and accurately represented in AI-generated answers

Early adoption positions your content for the next wave of web traffic.
llms.txt is complementary to traditional SEO. It doesn't affect search rankings but optimizes your site for "agent search optimization" (ASO): making your content more discoverable and accurately represented by AI agents and chatbots, which are becoming major traffic sources.
llms.txt is an emerging community standard defined at llmstxt.org. While not yet an official W3C standard, it's being rapidly adopted by major documentation sites and AI tool developers. Think of it like robots.txt in its early days.
Check our 03 INSTALL tab to see popular sites already using llms.txt. The list includes major documentation platforms, API providers, and developer tools. You can browse by popularity and tokens ingested.