
Will a dynamic robots.txt file that disallows crawling based on the time of day hurt SEO?

@Kevin317

Posted in: #RobotsTxt #Seo #WebCrawlers

We have a serious traffic issue on our site and we want to eliminate crawlers as part of the problem. If we disallow crawling using robots.txt only during some hours, will it hurt our SEO?
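To make the idea concrete, here is a rough sketch of the kind of time-based robots.txt endpoint we have in mind. The peak window and port are placeholders, and it uses only the Python standard library:

```python
from datetime import datetime, timezone
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical peak window (UTC hours) during which all crawling is disallowed.
PEAK_HOURS = range(14, 22)

BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/robots.txt":
            self.send_error(404)
            return
        hour = datetime.now(timezone.utc).hour
        body = (BLOCK_ALL if hour in PEAK_HOURS else ALLOW_ALL).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Placeholder port; in practice this would sit behind the main web server.
    HTTPServer(("", 8080), RobotsHandler).serve_forever()
```

The intent would be to serve Disallow: / only during our busiest hours and an empty Disallow: the rest of the time.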





1 Comment


@Courtney195

Focus your energy on improving your site/server instead of reaching for a hacky workaround. You never know what time of day search engines will crawl your site, so only allowing them during certain windows may have a hefty negative effect. If you happen to block crawlers right when Google comes along, you won't like the results.

What you can do for Google is set the crawl rate. Limiting the crawl rate ("how many requests per second Googlebot makes to your site when it is crawling it: for example, 5 requests per second") will help alleviate the load impact without completely blocking them. If you have a massive site then this may mean a longer delay before new updates are indexed, but it's better than dropping off entirely.

At least for Bing, you do have a bit more control. As per their how-to page:


You can tell Bingbot to crawl your site faster or slower than the normal crawl rate for each of the 24 hours in the day. This way you can limit Bingbot activity when your visitors are on your site and allow us more bandwidth during quieter hours.
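Separately from that per-hour setting in Bing Webmaster Tools, Bingbot (and a number of other crawlers, though not Googlebot) also honors a Crawl-delay directive in robots.txt, which throttles requests rather than blocking them outright. A minimal example, with the value chosen purely for illustration:

```
User-agent: bingbot
Crawl-delay: 10
```

Crawl-delay is commonly interpreted as a pause between successive requests, so it slows the crawler down during busy periods without telling it to stay away entirely.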


