How do I set the crawl rate in Google Webmaster Tools for seven days or a month?
My web host is blocking IPs that make many requests, and this includes Googlebot. I found this out by checking my Webmaster Tools account, where it said "network unreachable" for all the URLs that are indexed in Google.
I then thought of reducing the crawl rate, but I found that Google normally determines the crawl rate itself; users are not allowed to alter the number of requests per unit of time.
Some time ago I saw crawl rate settings in Google Webmaster Tools with options such as days, weeks, and months. Now it is different. How can I set the crawl rate for those periods?
Set up a sitemap XML file and specify the frequency with which your data changes. This won't force bots to stop visiting your site, but it should reduce how often they do.
Something like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
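As a side note not covered above (so treat it as an assumption): many non-Google crawlers honor a Crawl-delay directive in robots.txt. Googlebot ignores it (Google only respects the crawl-rate setting in Webmaster Tools), but it can reduce load from other aggressive bots:

```
# robots.txt — Crawl-delay is honored by Bing, Yandex and some others,
# but NOT by Googlebot. The value is the number of seconds between requests.
User-agent: *
Crawl-delay: 10
```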
Google Webmaster Tools > YourSite > Site configuration > Settings > Crawl rate > Set custom crawl rate.
The slowest rate possible is 500 seconds between requests.
This link might help your "server people" verify that the requests really come from Googlebot: googlewebmastercentral.blogspot.com/2006/09/how-to-verify-googlebot.html
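The verification process that post describes (reverse DNS on the requesting IP, check the hostname's domain, then forward DNS to confirm it maps back to the same IP) can be sketched in Python. The full check needs network access; the function names here are my own, not from the post:

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    """Check that a reverse-DNS hostname belongs to Google's crawler domains."""
    hostname = hostname.rstrip(".")
    return hostname.endswith(".googlebot.com") or hostname.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve to
    confirm the hostname maps back to the original IP. Needs network access."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except socket.herror:
        return False
    if not is_google_hostname(hostname):
        return False
    try:
        # forward DNS must return the original IP among the hostname's addresses
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

The suffix check matters: a spoofer can name a host `googlebot.com.evil.example`, which is why the code tests for the trailing domain rather than a substring.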