Is this Anti-Scraping technique viable with Crawl-Delay?
Possible Duplicate:
How do spambots work?
I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" error code to users who access an abnormal number of pages per minute.
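Server-side, the idea above amounts to per-client rate limiting. A minimal sketch, assuming a sliding-window counter keyed by client IP (the class name, threshold, and window length are illustrative, not from the question):

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window request counter per client IP.

    When a client exceeds `limit` requests within `window` seconds,
    `allow()` returns False and the server would respond with 503.
    """

    def __init__(self, limit=60, window=60.0):
        self.limit = limit            # assumed threshold: pages per window
        self.window = window          # window length in seconds
        self.hits = defaultdict(deque)

    def allow(self, ip, now=None):
        """Return True if the request is allowed, False -> serve 503."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Discard timestamps that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False              # over threshold: caller sends 503
        q.append(now)
        return True
```

In a request handler this would look like `if not limiter.allow(client_ip): return 503`. A deque per IP keeps both the append and the expiry step O(1) per request.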
I don't want search engine spiders ever to receive the error. My inclination is to set a Crawl-delay in robots.txt, which should keep compliant spiders' request rate below my 503 threshold.
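For instance, a robots.txt fragment like the following asks compliant crawlers to wait between requests (the 10-second value is illustrative; it would cap a compliant bot at roughly 6 pages per minute, which should be chosen to sit safely under the 503 threshold):

```
User-agent: *
Crawl-delay: 10
```

Note that Crawl-delay is a non-standard directive, so whether a given crawler honors it is exactly the question being asked here.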
Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?