I'm running a MediaWiki website that's all about Pokémon, and since the recent Pokémon Z/A announcements, I've been hit with over 200,000 requests per day from various AI crawlers, while I barely had any traffic before. I'm looking for realistic solutions to manage or reduce this incoming traffic. Is this something I have to fully accept, or are there effective ways to combat it?
4 Answers
If you're in a pinch and need a quick fix, try updating your robots.txt file to disallow the bots you don't want. Keep in mind that robots.txt only works against crawlers that choose to honor it, but the major AI vendors claim theirs do, so it could help in the short term at least! For example:
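Something along these lines blocks the best-known AI crawlers by their published user-agent tokens (this list is illustrative, not exhaustive; check each vendor's docs for current names):

```
# Block common AI crawlers by user-agent token.
# Only effective against bots that actually honor robots.txt.
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: Bytespider
Disallow: /

# Everyone else may crawl normally.
User-agent: *
Disallow:
```

Remember that robots.txt is a request, not an enforcement mechanism: badly behaved scrapers will ignore it entirely, so treat it as a first layer rather than a complete fix.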
You might want to check out Cloudflare. They offer some solid bot-blocking features, even on the free plan. I came across it recently when someone else had a similar issue, and it's definitely worth a look!
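Last I checked, Cloudflare even has a one-click option for blocking AI crawlers under its bot settings, available on the free plan. If you want finer control, a WAF custom rule with a Block action can match on user agent at the edge. A sketch of the rule expression (Cloudflare's expression syntax; the bot names here are assumptions, so adjust to whatever actually shows up in your logs):

```
(http.user_agent contains "GPTBot")
or (http.user_agent contains "ClaudeBot")
or (http.user_agent contains "CCBot")
or (http.user_agent contains "Bytespider")
```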
Cloudflare is a great option and handles this well! But if you're looking for a free solution you run yourself, you might have to get creative with your blocking methods, for instance rejecting the crawlers at the web server, as in the sketch below.
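One zero-cost approach is to reject known crawler user agents before the request ever reaches MediaWiki/PHP. A minimal sketch for Apache, assuming mod_rewrite is enabled and the user-agent list is illustrative:

```apache
# .htaccess (or vhost config): deny known AI crawler user agents with a 403.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (GPTBot|ClaudeBot|CCBot|Bytespider) [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, this doesn't depend on the bot's cooperation, though scrapers that spoof a browser user agent will still slip past it.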
In my opinion, that kind of traffic isn't helpful at all, so it could be wise to ban the IP addresses of those crawlers directly in your server's firewall. The MediaWiki manual also has guidance on dealing with crawlers (for example the Manual:Robots.txt page on mediawiki.org), so it's worth checking there too. A firewall sketch is below.
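If you've identified the offending networks in your access logs, dropping them at the firewall keeps the load off the web server and PHP entirely. A sketch using iptables; the CIDR ranges here are documentation placeholders, so substitute the ranges you actually see (several AI vendors publish their crawler IP lists):

```sh
# Drop traffic from crawler networks identified in your access logs.
# 203.0.113.0/24 and 198.51.100.0/24 are placeholder ranges.
iptables -A INPUT -s 203.0.113.0/24 -j DROP
iptables -A INPUT -s 198.51.100.0/24 -j DROP

# Equivalent nftables rule, assuming an existing "inet filter" table
# with an "input" chain:
# nft add rule inet filter input ip saddr 203.0.113.0/24 drop
```

The trade-off is maintenance: crawler IP ranges change over time, so you'll need to revisit the list periodically.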
Isn't Cloudflare's basic tier free, though? It seems like it could be a good starting place.