I've been noticing a trend where many users are getting their answers from AI tools such as ChatGPT and Perplexity instead of traditional search engines. This got me thinking: how can we effectively track if our websites are being cited or referenced in the responses generated by these AI systems?
Do folks here manually test queries to see if their site pops up, or have you developed a more systematic approach? Some people mention that this 'SEO for AI' field is really growing, and I'd love to hear what strategies you're using to keep tabs on your site's AI visibility.
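For context, the most systematic thing I've tried so far is a small script along these lines. It's only a sketch: it assumes the openai Python package and an OPENAI_API_KEY in the environment, and the domain and prompts are placeholders for whatever you'd actually want to show up for.

```python
from openai import OpenAI  # assumes the openai package is installed and OPENAI_API_KEY is set

client = OpenAI()

DOMAIN = "example.com"  # placeholder: the site you want to see cited
PROMPTS = [             # placeholder queries your audience might actually ask
    "What are the best resources for learning woodworking?",
    "Recommend a good site for comparing standing desks.",
]

for prompt in PROMPTS:
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    # Crude check: does the raw answer text mention my domain at all?
    answer = (response.choices[0].message.content or "").lower()
    print(f"{prompt!r}: {'cited' if DOMAIN in answer else 'not cited'}")
```

It only catches explicit mentions, and the answers vary from run to run, so I'm not sure how much signal it really gives, which is partly why I'm asking.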
5 Answers
Honestly, relying on IP data probably isn't effective anymore, especially as AI tools keep gathering information from users. And if you're expecting compensation when your ideas show up in their answers, that's unlikely; trying to monetize your content in this context seems like a stretch.
One approach I use is tracking referrer data through server logs. For some of my larger sites, I find that up to 10% of traffic on certain pages comes from AI sources, which is substantial!
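Here's a minimal sketch of the kind of log pass I mean. It assumes a combined-format access log (nginx/Apache); the log path and the referrer list are just examples, so adjust them to whatever you actually serve and whichever AI domains you care about.

```python
import re
from collections import Counter

AI_REFERRERS = ("chat.openai.com", "chatgpt.com", "perplexity.ai", "gemini.google.com")

# In the combined log format the referrer is the second-to-last quoted field,
# right after the status code and response size.
LINE_RE = re.compile(r'\d{3} \S+ "(?P<referrer>[^"]*)" "[^"]*"\s*$')

total = 0
hits = Counter()
with open("/var/log/nginx/access.log") as log:  # example path
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        total += 1
        referrer = match.group("referrer")
        if any(domain in referrer for domain in AI_REFERRERS):
            hits[referrer] += 1

ai_total = sum(hits.values())
print(f"AI-referred requests: {ai_total} of {total} ({100 * ai_total / max(total, 1):.1f}%)")
for referrer, count in hits.most_common(10):
    print(f"{count:6d}  {referrer}")
```

The caveat is that this only counts click-throughs that actually carry a referrer header, so it undercounts; it says nothing about how often you're cited without anyone clicking.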
It's basically just SEO work. Your site gets picked up the same way it does by search engines; unless you have a strategy aimed specifically at AI, you'll likely see the same results as before.
I have a friend who's a photographer and she got a client solely because ChatGPT recommended her. The client searched for a photographer in her area, and she was the top suggestion! Seems like this is a new way to do SEO.
That's a great example! It's proof that AI can actually help businesses. Did she do anything special to end up there, or was it just luck?
The market is saturated with tools designed for this kind of tracking—names like Knowatoa, Ahrefs, and Writesonic pop up a lot. What makes your tracking approach stand out from these existing solutions?
Absolutely, it's a crowded space. But there must be some gaps, right? What do you feel is missing from the current tools?
That's interesting—10% is quite the chunk. Have you noticed that percentage increasing over time?