How exactly is Google Webmaster Tools measuring "Site Performance"?

@Goswami781

Posted in: #GoogleSearchConsole #Performance

I've been working for two months now on improving our response time (mainly server-side) on a new forum (a brand new product from a technical point of view) that we launched in Germany a few months ago, and I'm quite surprised by the results I'm getting. I monitor our response time using Apache logs and our own implementation of a Boomerang beacon.

Using my stats, I can see that our new product responds in about 680 ms, where our old product responded in about 1050 ms. On the other hand, Google Webmaster Tools tells us that our pages have an average response time of about 1500 ms today, where it was 700 ms three months ago with our old product.
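For reference, one quick way to cross-check a server-side average like this is to pull the response-time field out of the Apache access log. The sketch below assumes a LogFormat whose last field is %D (response time in microseconds); the helper name and the sample lines are illustrative, not our actual setup:

```python
import re

# Assumes the Apache LogFormat ends with %D (response time in microseconds),
# e.g.: LogFormat "%h %l %u %t \"%r\" %>s %b %D" timed
LINE_RE = re.compile(r'\s(\d+)$')  # grab the last field on each line

def average_response_ms(log_lines):
    """Average response time in milliseconds over the given access-log lines."""
    times = []
    for line in log_lines:
        m = LINE_RE.search(line.rstrip())
        if m:
            times.append(int(m.group(1)) / 1000.0)  # microseconds -> ms
    return sum(times) / len(times) if times else 0.0

sample = [
    '203.0.113.5 - - [26/07/2011:10:00:00 +0200] "GET /forum HTTP/1.1" 200 5120 680000',
    '203.0.113.6 - - [26/07/2011:10:00:01 +0200] "GET /topic/42 HTTP/1.1" 200 2048 520000',
]
print(average_response_ms(sample))  # 600.0
```

Note that this only captures server time; it says nothing about what the browser does after the HTML arrives, which turns out to matter below.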

I figured that GWT was taking client-side metrics into account, so I added some measures to our Boomerang beacon, and everything looks just fine. I've also run some random pages through YSlow and Google's Page Speed, and everything looks better than it was before. We even have an 82% score on Google's Page Speed tool, which is quite cool for a site with some ads on it :)

Lately, we have signed a deal with Akamai to use two of their products: CDN for our static files (we were using another CDN before, but it wasn't very effective) and RMA to improve network routes. We have also introduced a new, aggressive cache mechanism to ensure that most of the pages served to crawlers come from our memcache grid. After checking my metrics, it seems that these changes have brought our response time from 650 ms down to about 500 ms, which is good (still not great, but definitely an improvement). But Webmaster Tools continues to report an increasing average response time, while we see it decreasing over the same period.
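For illustration, the crawler-cache idea can be sketched like this (the function names, crawler tokens, and the dict standing in for the memcache grid are all hypothetical, not our actual implementation):

```python
import hashlib

# Hypothetical sketch of an "aggressive cache for crawlers": serve bot
# requests from a cache keyed by URL. A plain dict stands in for memcache.
CRAWLER_TOKENS = ("googlebot", "bingbot")
cache = {}

def is_crawler(user_agent):
    """Crude User-Agent check for known crawler tokens."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def render_page(url):
    # Placeholder for the real (slow) page generation.
    return "<html>content of %s</html>" % url

def handle_request(url, user_agent):
    if is_crawler(user_agent):
        key = hashlib.md5(url.encode()).hexdigest()
        if key not in cache:
            cache[key] = render_page(url)  # cache miss: generate once
        return cache[key]                  # cache hit: skip generation
    return render_page(url)                # regular users get fresh pages
```

The design trade-off is that crawlers may see slightly stale pages, in exchange for consistently fast responses on the requests that feed Googlebot's crawl-time statistics.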

Have you ever seen this kind of weird behavior on your sites while doing performance improvements? Do you have any idea how to monitor the same thing Google measures with Site Performance in Google Webmaster Tools, so that we could improve our site and constantly check that it is what Google wants?

Edit 2011/07/26: Thanks for your answers, guys! Nevertheless, I was not precise enough. The main issue we have right now is not with the Site Performance page but with the Crawl Stats one. We probably found an issue on our side with some very slow pages (around 3000 ms!) and we are trying to fix them. I'll keep you posted as soon as I have more info.
Thanks again!




1 Answer


@Voss4911412

Per the official guidance ( www.google.com/support/webmasters/bin/answer.py?answer=158541 ):

Site Performance is an experimental Webmaster Tools Labs feature that shows you latency information about your site. (To see Site Performance data, you must add and verify your site in Webmaster Tools.)

Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature.


Since users can often interact with web pages before they are fully downloaded, this is a very strict interpretation of website speed. But it's reasonable to be extra strict here: if a page has tons of dynamic JavaScript and dynamically loaded ads, it's arguably correct for the user to perceive the page as loading slowly.

One way to measure something close to this yourself is with the Network tab in Chrome Developer Tools (Ctrl+Shift+I).

[Screenshot: Chrome Network tab timeline showing the DOMContentLoaded (blue) and Load (red) event lines]
The two relevant events are DOMContentLoaded event fired (blue line) and Load event fired (red line). On a random page here on this site, those numbers are around 600 ms and 1.1 s, respectively. That's much, much larger than the time to download the page from the command line using wget -- and clearly reflects the time the client browser spends rendering the content it downloads over HTTP.
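To make the gap concrete, here is a small sketch relating those two DevTools markers to pure server/download time, using made-up Navigation Timing-style timestamps (every number below is illustrative, not a real measurement):

```python
# Hypothetical Navigation Timing-style timestamps, in ms since navigation start.
timing = {
    "navigationStart": 0,
    "responseStart": 180,               # first byte from the server
    "responseEnd": 420,                 # HTML fully downloaded (what wget sees)
    "domContentLoadedEventStart": 600,  # blue line in the Network tab
    "loadEventStart": 1100,             # red line: images, scripts, ads loaded
}

server_and_download = timing["responseEnd"] - timing["navigationStart"]
full_load = timing["loadEventStart"] - timing["navigationStart"]

print(server_and_download)  # 420
print(full_load)            # 1100
# The 680 ms gap is time the browser spends parsing, fetching subresources,
# and rendering -- invisible to server-side Apache logs.
```

So a site whose server time improves can still look slower on a full-load metric if its client-side weight (ads, scripts, images) grows.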

(It also seems a tad unfair to me, since a website with all static pages is going to have a huge advantage over one that dynamically customizes each page for the specific user, but I guess them's the breaks!)


