Defense against rude web crawlers
Possible Duplicate:
How To Track Down and Stop Rogue Bots?
I had an incident yesterday where my web site was taken down by a web crawler that was ignoring my robots.txt. I'm pretty sure nothing malicious was intended, but the crawler wandered into pages that completely overloaded my database with time-consuming requests.
What to do for "next time"?
(1) Obviously, make it harder to trigger onerous requests just by clicking on a link (a rough sketch of what I mean is further below).
(2) The other idea I have is to add "poison pill" links that would be invisible to humans, but would have the side effect of marking the requesting IP address as a robot (also sketched below).
The question is: would this be likely to trigger false positives, for example because the browser was trying to prefetch the "mark-me-as-a-robot" link?
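For (2), something like the following is what I'm picturing. This is only a rough sketch, not my actual setup; the /trap URL and the in-memory BLOCKED_IPS set are just placeholders (in production the blocklist would have to live somewhere shared, not in process memory):

```python
# Sketch of a "poison pill" (honeypot) link using Flask.
# robots.txt would also carry:
#   User-agent: *
#   Disallow: /trap
from flask import Flask, abort, request

app = Flask(__name__)
BLOCKED_IPS = set()  # placeholder: would be a shared store in reality

@app.before_request
def reject_known_bots():
    # Refuse service to any IP that has previously followed the trap link.
    if request.remote_addr in BLOCKED_IPS:
        abort(403)

@app.route("/trap")
def trap():
    # Only a crawler that ignores robots.txt and rel="nofollow"
    # should ever reach this URL.
    BLOCKED_IPS.add(request.remote_addr)
    abort(403)

@app.route("/")
def index():
    # The trap link is hidden from humans (display:none), marked nofollow,
    # and disallowed in robots.txt, which should keep well-behaved crawlers
    # and prefetchers away and cut down on false positives.
    return (
        '<html><body>Normal content...'
        '<a href="/trap" rel="nofollow" style="display:none">do not follow</a>'
        '</body></html>'
    )
```

As far as I understand, browsers only prefetch links that are explicitly hinted (rel="prefetch" or similar), so a plain hidden link shouldn't normally be fetched by a real browser; the bigger false-positive risk is probably multiple users sharing one IP behind a NAT or proxy.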
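For (1), the kind of change I have in mind is putting the expensive queries behind POST-only endpoints, since crawlers generally only follow GET links. Again just a sketch; the endpoint name and the query function are made up:

```python
# Sketch for (1): reach the heavy query via a form/button (POST) instead of
# a hyperlink, so a link-following bot can't trigger it by accident.
# Page side would use something like:
#   <form method="post" action="/expensive-report"><button>Run report</button></form>
from flask import Flask, request

app = Flask(__name__)

@app.route("/expensive-report", methods=["POST"])
def expensive_report():
    # A GET to this URL now gets 405 Method Not Allowed automatically,
    # so crawling the site's links never starts the heavy database work.
    return run_expensive_query(request.form.get("year"))

def run_expensive_query(year):
    return f"report for {year}"  # stand-in for the real time-consuming query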