Googlebot crawl rate is too slow for a huge site
I have a new website with 140k pages, according to my sitemaps.
After I submitted my sitemap index to Google, Googlebot started crawling the site a couple of hours ago at a steady pace, but it's crawling too slowly: roughly one request per minute.
At that rate it will take about 100 days for the site to be indexed (140,000 pages ÷ ~1,440 requests per day ≈ 97 days)! Is there a way to increase the rate? And why does all of Google's documentation say that changing the crawl rate isn't recommended and will likely only make crawling slower?
1 Answer
To get Googlebot to crawl faster you need:
A fast server. The less time it takes Googlebot to download each page, the faster it will crawl. Don't worry about images, CSS, and JavaScript; just serve the HTML faster. I find that enabling gzip compression helps, as does caching and preloading some of your site's data where appropriate. Shrink your page sizes as much as possible (see the sketch after this list).
More PageRank. The higher your site's PageRank, the more pages Googlebot will crawl, and it will return to re-crawl high-PageRank pages much more frequently.
If you don't have enough PageRank, Google will probably also choose not to index a large portion of your pages, even once it does crawl them.
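As a minimal sketch of the "fast server" point, assuming you run your own application server (the handler, port, and HTML below are illustrative, not something specific to your site), here is a Go standard-library server that gzips responses for clients that accept it and returns a small, pre-built HTML body quickly:

```go
package main

import (
	"compress/gzip"
	"net/http"
	"strings"
)

// gzipResponseWriter sends the response body through a gzip writer.
type gzipResponseWriter struct {
	http.ResponseWriter
	gz *gzip.Writer
}

func (w gzipResponseWriter) Write(b []byte) (int, error) {
	return w.gz.Write(b)
}

// withGzip compresses responses for clients that advertise gzip support,
// which includes Googlebot.
func withGzip(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if !strings.Contains(r.Header.Get("Accept-Encoding"), "gzip") {
			next.ServeHTTP(w, r)
			return
		}
		w.Header().Set("Content-Encoding", "gzip")
		gz := gzip.NewWriter(w)
		defer gz.Close()
		next.ServeHTTP(gzipResponseWriter{ResponseWriter: w, gz: gz}, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/html; charset=utf-8")
		// Serve pre-rendered or cached HTML here rather than building the
		// page on every request, so each Googlebot hit returns quickly.
		w.Write([]byte("<html><body>fast, small HTML</body></html>"))
	})
	http.ListenAndServe(":8080", withGzip(mux))
}
```

The same idea applies whatever stack you use: enable compression at the web server or application layer and serve cached HTML so each request completes in as few milliseconds as possible.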