What is an acceptable "Time spent downloading a page (in milliseconds)" in Google Search Console?

@Barnes591

Posted in: #GoogleSearchConsole #Performance

Is there any generally accepted average time for crawling a page in Google Webmasters Tools? You can find this report on the Crawl Stats page.

I am trying to audit possible SEO issues; my website currently has an average time to crawl of around 700 ms.


1 Comment


@Megan663

That report shows how much time Googlebot spends downloading each page on your site, not including linked resources such as JS, CSS, iframes, and images.

It is less important to optimize your website for Googlebot than it is to optimize it for users. Instead of looking at this report, start by looking at how long it takes to request your pages, download all their resources, and render them. The developer tools in Firefox and Chrome show a detailed breakdown of each of those stages in their network panels.
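
If you want to check the first two stages outside the browser, here is a minimal Python sketch. It assumes the third-party `requests` library is installed, and the URL is a placeholder to replace with one of your own pages:

```python
import time
import requests

url = "https://example.com/"  # placeholder; use one of your own pages

start = time.monotonic()
resp = requests.get(url, stream=True)  # returns once headers arrive, body unread
ttfb = time.monotonic() - start        # rough time-to-first-byte (server think time)
body = resp.content                    # now read the full body
total = time.monotonic() - start       # total time to download the raw HTML

print(f"Status: {resp.status_code}")
print(f"Time to first byte: {ttfb * 1000:.0f} ms")
print(f"Total download:     {total * 1000:.0f} ms ({len(body)} bytes)")
```

Run it against a few representative pages; the first number corresponds roughly to the server "think time" discussed further down.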


As long as the time to download the page is under 3 seconds (3,000 ms), you are fine from an SEO perspective. Google starts penalizing sites that take 7 or more seconds. Improvements below 3 seconds can be great for users, but they don't typically earn better search engine rankings.

Techniques for improving total performance for users can include putting JS and CSS inline in the page. That can slow down the initial page download that Googlebot measures, but it is important to optimize for users first. For a fuller list of ways to optimize your site for users, see: Ideas to improve website loading speed?
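
To make the inlining idea concrete, here is a minimal build-step sketch. Everything in it is an assumption for illustration: the regex only matches link tags written exactly in the form shown, and `css_dir` is a hypothetical directory holding your stylesheets:

```python
import re
from pathlib import Path

def inline_css(html: str, css_dir: Path) -> str:
    """Replace simple <link rel="stylesheet" href="..."> tags with <style> blocks."""
    def repl(match: re.Match) -> str:
        css = (css_dir / match.group(1)).read_text()
        return f"<style>{css}</style>"

    # Note: regex-based HTML rewriting is fragile; a real build step would use
    # an HTML parser and only inline small, critical stylesheets.
    return re.sub(r'<link rel="stylesheet" href="([^"]+)"\s*/?>', repl, html)

# Example usage with placeholder file names:
page = Path("index.html").read_text()
Path("index.inlined.html").write_text(inline_css(page, Path("css")))
```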

Back to your question about the time Googlebot spends downloading the initial page. It is possible to make changes here that are better for Googlebot but worse for users. For example, you could move CSS and JS into external files rather than keeping them inline, so that Googlebot wouldn't have to download them with every page. Resist the temptation to optimize for Googlebot at the expense of users.

The initial page download that Googlebot sees is also what users start with, so most optimizations in this area will improve the user experience as well. 700 ms is not great: if the initial page alone takes almost a second, you can easily see how the full page with images could take many seconds to render. I would aim for no more than 300 ms.

Your browser's developer tools will give you some insight into where to look. If there is a 400 ms gap between when the request is made and when the server responds, that usually indicates the server is sitting and thinking for a long time. You can remedy that by (a caching sketch follows the list):


Getting a faster server
Moving to static pages rather than dynamic
Implementing caching
Improving database performance
Optimizing database queries
Reducing the number of database queries
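
As promised, here is a minimal in-process sketch of the caching remedy. It is illustrative only: `render_page` is a hypothetical stand-in for whatever slow, dynamic generation your server does, and a real site would more likely cache at the web server, reverse proxy, or CMS layer:

```python
import time

_cache: dict[str, tuple[float, str]] = {}  # path -> (expiry time, rendered HTML)
TTL_SECONDS = 300                          # serve cached copies for 5 minutes

def render_page(path: str) -> str:
    """Hypothetical stand-in for slow dynamic rendering (DB queries etc.)."""
    time.sleep(0.4)  # simulate 400 ms of server think time
    return f"<html><body>Content for {path}</body></html>"

def get_page(path: str) -> str:
    now = time.monotonic()
    hit = _cache.get(path)
    if hit and hit[0] > now:   # fresh cache entry: skip rendering entirely
        return hit[1]
    html = render_page(path)   # slow path: render once, then reuse
    _cache[path] = (now + TTL_SECONDS, html)
    return html

# First call pays the 400 ms; repeat calls within the TTL return immediately.
print(len(get_page("/about")))
print(len(get_page("/about")))
```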


If you are using a CMS such as WordPress, caching plugins can really help.

When you improve the performance for Googlebot, you will find that Googlebot is able to crawl your site more frequently and more deeply. The speed at which your servers can feed pages to Googlebot is often the limiting factor in how much Googlebot is willing to crawl your site.

A really fast site will take 50 ms or so per page. I have been able to optimize one of my sites to close to that.
