Is this anti-scraping technique viable with Crawl-delay?

@Moriarity557

Posted in: #Googlebot #ScraperSites #WebCrawlers

Possible Duplicate:
How do spambots work?




I want to prevent web scrapers from abusing the 1,000,000 pages on my website. I'd like to do this by returning a "503 Service Unavailable" status code to users who access an abnormal number of pages per minute.
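For context, here is a minimal sketch of that throttling idea, assuming a Python/Flask app and an in-memory per-IP sliding window; the WINDOW_SECONDS and THRESHOLD values are hypothetical, not from the question:

    import time
    from collections import defaultdict, deque

    from flask import Flask, abort, request

    app = Flask(__name__)

    WINDOW_SECONDS = 60        # measure requests per minute
    THRESHOLD = 120            # hypothetical pages-per-minute limit per IP
    hits = defaultdict(deque)  # IP -> timestamps of recent requests

    @app.before_request
    def throttle():
        now = time.time()
        recent = hits[request.remote_addr]
        # discard timestamps that have fallen out of the sliding window
        while recent and now - recent[0] > WINDOW_SECONDS:
            recent.popleft()
        recent.append(now)
        if len(recent) > THRESHOLD:
            # compliant spiders should stay under this rate via Crawl-delay,
            # so only abusive clients hit the 503
            abort(503, description="Too many requests; slow down.")

    @app.route("/")
    def index():
        return "ok"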

I don't want search engine spiders ever to receive the error. My inclination is to set a Crawl-delay in robots.txt that keeps spiders' request rate below my 503 threshold.
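For example, the following robots.txt (the 10-second delay is an illustrative value) would cap a compliant spider at one request every 10 seconds, i.e. 6 pages per minute:

    User-agent: *
    Crawl-delay: 10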

Is this an appropriate solution? Do all major search engines support the directive? Could it negatively affect SEO? Are there any other solutions or recommendations?
