As AI models like ChatGPT, Claude, and Google Gemini increasingly crawl the web to learn from public content, website owners need better control over how their data is used. That’s where the llms.txt file comes in: a new way to manage AI bot access and protect your content for SEO and privacy.
What is llms.txt?
llms.txt is a plain text file placed at the root of your website (like kodepen.com/llms.txt). It works similarly to robots.txt, but instead of managing search engine crawlers, it tells AI bots and large language models (LLMs) which parts of your site they can or cannot use for training or referencing.
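In practice, a compliant AI crawler reads llms.txt from your site root and skips any path you disallow, though honoring the file is voluntary on the bot’s side. Because the directive syntax described here mirrors robots.txt, you can preview how a bot would interpret your rules with Python’s built-in robots.txt parser. Here is a minimal sketch; the rules and domain are the examples used later in this article, and the rules are inlined so it runs offline:

```python
from urllib import robotparser

# Preview how an AI crawler would interpret llms.txt rules.
# This works only because the syntax described in this article
# mirrors robots.txt; kodepen.com is the article's example domain.
rules = """\
User-agent: GPTBot
Disallow: /private/

User-agent: ClaudeBot
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("GPTBot", "https://kodepen.com/private/page"))     # False
print(parser.can_fetch("GPTBot", "https://kodepen.com/blog/post"))        # True
print(parser.can_fetch("ClaudeBot", "https://kodepen.com/private/page"))  # True
```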
How to Create an llms.txt File
- Open any text editor (like Notepad, Sublime Text, or VS Code).
- Write rules like this:

```
User-agent: GPTBot
Disallow: /private/

User-agent: ClaudeBot
Allow: /
```
- Save the file as llms.txt.
- Upload it to your website’s root directory using FTP or your hosting file manager.
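Once the file is uploaded, it’s worth confirming that your server actually serves it from the root. A quick check (a sketch; kodepen.com is this article’s example domain, so substitute your own):

```python
from urllib import request

# Confirm the uploaded llms.txt is reachable at the site root.
# kodepen.com is the example domain from this article; use your own.
with request.urlopen("https://kodepen.com/llms.txt") as resp:
    print(resp.status)                  # expect 200
    print(resp.read().decode("utf-8"))  # expect the rules you wrote
```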
Why It Matters for SEO and AI
- Signals which content AI bots may not use for training, though compliance is voluntary on the crawler’s side
- Gives you more control over how AI models interpret and present your content alongside traditional SEO
- Builds ethical boundaries for AI data usage
By using llms.txt, you’re staying ahead of the curve in managing your content in the age of AI.