I recently experimented with llms.txt and tested it on both ChatGPT and Claude. Surprisingly, neither seemed to take notice of the content, even though I embedded hidden instructions on every page in the section telling AIs to read the llms.txt file. Has anyone else had success with this format, or tips on how to get AIs to use it effectively?
4 Answers
I've just been linking to llms.txt directly. It saves a lot of tokens, which is great! Everything runs smoother this way, especially on platforms like ChatGPT.
It's more of a grassroots effort to get llms.txt adopted. We're still convincing more people to use it, but I've seen some positive reactions so far.
llms.txt is designed to pack a lot of information efficiently, which is great for AI. Recently, I made one for my component documentation. For example, I tried generating a page with the new Nuxt UI v4 components, but the AI initially used the outdated version. Once I redirected it to llms.txt, it generated the page perfectly using the latest components!
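For anyone who hasn't seen one, the proposed llms.txt format is just plain Markdown: an H1 with the project name, a blockquote summary, then H2 sections listing links with short descriptions. Here's a rough sketch for a component library like the one described above (the project name, URLs, and section contents are made up for illustration):

```markdown
# Acme UI

> Documentation index for Acme UI v4. These links point to clean
> Markdown sources, so an AI can read them without crawling HTML.

## Components

- [Button](https://example.com/docs/button.md): props, variants, and usage examples
- [Modal](https://example.com/docs/modal.md): accessible dialog component

## Optional

- [Changelog](https://example.com/changelog.md): release notes for v4
```

Pointing the AI straight at a file like this is what fixed the outdated-component problem for me: it gets the current doc index in one compact page instead of whatever stale version it has memorized.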
I published some stats on my experience with llms.txt. You can find it here: [https://www.sygnal.com/blog/llms-txt-webflow](https://www.sygnal.com/blog/llms-txt-webflow). It might give you more insight!
+1 for that method! Feeding it the link seems to keep the conversation much more consistent, at least with ChatGPT. I haven't had a chance to test it on Claude yet.