I'm dealing with a pesky problem on my web server: malicious crawlers keep probing `/.env` hoping to find exposed credentials. I'm looking for creative but harmless ways to serve misleading content at that endpoint to throw them off the scent. Ideas I've been considering:

1) Serving a huge amount of text to bog down their scraping (though I'm mindful of my own bandwidth costs).
2) Serving fake credentials to keep them busy (rough sketch below).
3) Embedding injection payloads (SQL, XSS, etc.) to confuse whatever pipeline parses the results.

Has anyone implemented something like this, or does anyone have better suggestions?
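For idea 2), here's roughly what I had in mind, using Flask purely as an example (the route, the log line, and every value in the fake file are made up for illustration):

```python
# Rough sketch: serve a fake .env full of plausible-looking but useless values.
# Flask, the log line, and every credential below are placeholders.
from flask import Flask, Response, request

app = Flask(__name__)

FAKE_ENV = (
    "APP_ENV=production\n"
    "DB_HOST=10.0.0.12\n"
    "DB_USERNAME=admin\n"
    "DB_PASSWORD=correct-horse-battery-staple\n"
    "AWS_ACCESS_KEY_ID=AKIAFAKEFAKEFAKE0000\n"
    "AWS_SECRET_ACCESS_KEY=ThisIsNotARealSecretKeyDontBother\n"
)

@app.route("/.env")
def fake_env():
    # Log the probe so there's a record of who is scanning, then hand over the decoy.
    app.logger.info("Served fake .env to %s", request.remote_addr)
    return Response(FAKE_ENV, mimetype="text/plain")

if __name__ == "__main__":
    app.run(port=8080)
```

The goal is just to return something that looks real enough to keep a scanner interested while logging who asked.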
6 Answers
Or just redirect them to your YouTube channel for some extra views!
You could also set up your server to stream one random byte every minute. It would keep them downloading, but only barely! Talk about slow responses!
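Something like this, maybe? A minimal sketch with the standard-library `http.server`, so there are no dependencies; the one-byte-a-minute pacing and the one-hour cap are arbitrary numbers:

```python
# Sketch of a "tarpit" handler: drip one byte a minute so the scraper's
# connection stays open while using almost no bandwidth. Numbers are arbitrary.
import os
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/.env":
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        # No Content-Length, so the client has to wait for us to finish.
        self.end_headers()
        try:
            for _ in range(60):            # cap it at an hour per connection
                self.wfile.write(os.urandom(1))
                self.wfile.flush()
                time.sleep(60)             # one byte per minute
        except (BrokenPipeError, ConnectionResetError):
            pass                           # client gave up, which is the point

if __name__ == "__main__":
    ThreadingHTTPServer(("", 8080), TarpitHandler).serve_forever()
```

The threading server matters here: otherwise a single tarpitted client would block every other request.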
Why not set up a fake crypto site? Feed them credentials that let them 'log in,' then show them they need to deposit crypto before they can withdraw their 'gains.' It could waste a lot of their time!
That’s a bit extreme, don’t you think? Sounds like a solid fraud scheme.
Yeah, this is a terrible idea. Definitely illegal.
Have you looked at HellPot? It's designed to trap bots that ignore robots.txt in an endless stream of text generated from Nietzsche. Could be a funny, absurd way to keep them busy.
Honestly, don't overcomplicate this. These crawlers hit tons of hosts quickly and mostly move on; if your server responds with something unusual instead of a plain 404, you might just flag yourself as worth a closer look. Simple is probably safer!
You could try serving a zip bomb: a small gzip response that expands to gigabytes if the client actually decompresses it. Honestly, though, most of these crawlers probably won't handle the compression at all and will just treat the bytes as text, so it might just be a fun gamble.
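If anyone wants to experiment, the usual version of the trick is to pre-compress a big buffer of zeros and serve it with a `Content-Encoding: gzip` header, so only clients that actually honour the header inflate it. A rough sketch, with arbitrary sizes and Flask again just as an example:

```python
# Rough sketch: pre-compress a large buffer of zeros and serve it with
# Content-Encoding: gzip. Clients that honour the header inflate ~1 GB;
# everyone else just downloads ~1 MB of noise. Sizes are arbitrary.
import gzip
import io

from flask import Flask, Response

app = Flask(__name__)

def build_gzip_bomb(uncompressed_mb: int = 1024) -> bytes:
    """Compress `uncompressed_mb` megabytes of zeros into one gzip stream."""
    chunk = b"\x00" * (1024 * 1024)   # 1 MB of zeros compresses extremely well
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb") as gz:
        for _ in range(uncompressed_mb):
            gz.write(chunk)
    return buf.getvalue()             # roughly 1 MB for 1 GB of input

BOMB = build_gzip_bomb()              # built once at startup

@app.route("/.env")
def env_bomb():
    return Response(
        BOMB,
        mimetype="text/plain",
        headers={"Content-Encoding": "gzip"},
    )

if __name__ == "__main__":
    app.run(port=8080)
```

The decoy is built once at startup, and each hit costs roughly 1 MB of bandwidth, which is exactly the trade-off the question already worried about.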
Yeah, I feel like most bots wouldn't even attempt that. They usually just grab whatever text they find.
That sounds complicated, almost like a twisted Slowloris attack!