How Can I Deal with 200,000 Daily Requests from AI Crawlers?

Asked By PixelatedNinja99 On

I'm running a MediaWiki website that's all about Pokémon, and since the recent Pokémon Z/A announcements I've been hit with over 200,000 requests per day from various AI crawlers, whereas I barely had any traffic before. I'm looking for realistic ways to manage or reduce this traffic. Is this something I just have to accept, or are there effective ways to combat it?

4 Answers

Answered By CleverBot123 On

If you're in a pinch and need a quick fix, try updating your robots.txt file to disallow the bots you don't want crawling your site. It can help in the short term, at least with crawlers that actually honor robots.txt!
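
For example, a minimal robots.txt along these lines asks several widely documented AI crawlers to stay away. The user-agent names below are common examples rather than a complete list, and only bots that respect robots.txt will obey it:

    # Ask well-known AI crawlers not to crawl anything (only effective if they comply).
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: ClaudeBot
    Disallow: /

    User-agent: PerplexityBot
    Disallow: /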

Answered By CloudyDays42 On

You might want to check out Cloudflare. They offer bot-blocking features, including an option to block AI crawlers, even on the free plan. I came across it recently when someone else had a similar issue. It's definitely worth a look!

Answered By TechieTrendsetter On

Cloudflare is a great option and handles this well! But if you're looking for a free solution, you might have to get creative with your blocking methods.
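
One way to get creative without a third-party service, assuming the wiki sits behind nginx, is to refuse requests whose User-Agent matches the crawlers showing up in your access logs. A rough sketch, with placeholder user-agent names and hostname to adapt:

    # Map known AI crawler user agents to a flag (case-insensitive regex match).
    # The map block goes in the http{} context, alongside the server block.
    map $http_user_agent $is_ai_crawler {
        default          0;
        ~*GPTBot         1;
        ~*CCBot          1;
        ~*ClaudeBot      1;
        ~*PerplexityBot  1;
    }

    server {
        listen 80;
        server_name wiki.example.org;   # placeholder hostname

        # Refuse flagged crawlers before they ever reach MediaWiki/PHP.
        if ($is_ai_crawler) {
            return 403;
        }

        # ... the rest of the MediaWiki site configuration goes here ...
    }

Matching the User-Agent header is easy for a bot to evade, but the crawlers generating this kind of bulk traffic usually identify themselves, so this can cut a lot of load for very little effort.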

SwiftCoder88 -

Isn't Cloudflare's basic tier free though? It seems like it could be a good starting place.

Answered By FirewallGuru88 On

In my opinion, that kind of traffic isn't doing your site any good. It can be worth banning the crawlers' IP ranges directly in your server's firewall. Plus, there are MediaWiki tools and guidelines that can help with this.
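
As a sketch of the firewall approach on a Linux server, assuming iptables and ipset are available and using a documentation-only address range as a placeholder for whatever ranges show up in your access logs:

    # Collect offending ranges in a set so the rule list stays short.
    ipset create ai_crawlers hash:net
    ipset add ai_crawlers 198.51.100.0/24    # placeholder range; substitute the ones you actually see

    # Drop anything coming from addresses in that set.
    iptables -A INPUT -m set --match-set ai_crawlers src -j DROP

Keep in mind that crawler IP ranges change over time, so an IP-level block usually needs occasional maintenance, whereas user-agent or Cloudflare-style blocking tends to be less fiddly.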
