
Googlebot crawl rate is too slow for a huge site

@Murray432

Posted in: #Googlebot #GoogleSearchConsole #WebCrawlers

I have a new website with 140k pages, according to my sitemaps.

After I submitted my sitemap index to Google, Googlebot started crawling the site a couple of hours ago at a steady pace, but it's doing so too slowly: roughly one request per minute.

At that rate, it'll take about 100 days for the site to be indexed! Is there a way to increase the rate? And why does Google's documentation say they don't recommend changing the crawl rate, and that changing it will only make crawling slower?
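As a quick sanity check of that figure, here is a minimal back-of-the-envelope sketch using the numbers from the question (140,000 URLs, roughly one Googlebot request per minute); the exact page count and rate are the asker's estimates, not measured values:

```python
# Rough estimate of how long a full crawl takes at the observed rate.
# Figures come from the question: ~140,000 URLs, ~1 request per minute.
pages = 140_000
requests_per_minute = 1

minutes = pages / requests_per_minute
days = minutes / (60 * 24)
print(f"~{days:.0f} days to crawl every URL once")  # ~97 days
```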





1 Comment


 

@Eichhorn148

To get Googlebot to crawl faster, you need:


A fast server. The less time it takes Googlebot to download each page, the faster it will crawl. Don't worry about the images, CSS, and JavaScript; just serve the HTML faster. I find that enabling gzip compression helps (see the sketch after this list). Cache and preload some of your site's data if appropriate, and shrink your page sizes as much as possible.
More PageRank. The higher your site's PageRank, the more Googlebot will crawl. It will also return and re-crawl higher-PageRank pages much more frequently.


If you don't have enough PageRank, Google will probably also choose not to index a large portion of your pages, even once it does crawl them.
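To illustrate the page-size point from the first item above, here is a minimal sketch of how much gzip typically shrinks an HTML payload; the sample markup is made up for illustration, not taken from the asker's site, and real savings depend on how repetitive the page is:

```python
# Minimal sketch: raw vs. gzip-compressed size of an HTML payload.
# The sample markup below is hypothetical; real pages will vary.
import gzip

html = (
    "<html><head><title>Example product page</title></head><body>"
    + "<p>Repetitive catalogue markup compresses very well.</p>" * 200
    + "</body></html>"
).encode("utf-8")

compressed = gzip.compress(html)
print(
    f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes "
    f"({len(compressed) / len(html):.1%} of original)"
)
```

Serving the compressed form (via your web server's gzip setting) means Googlebot spends less time per request, which is exactly what the first item is about.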


