Is it normal to have some huge anomalies in Google Analytics site speed tracking?
I've recently started using the new _trackPageLoadTime function in Google Analytics to measure page load times. However, I have noticed that a few pages appear with a huge "Average Load Time". One page is showing a time of 556.22 seconds! Several other pages show average times of 30+ seconds.
Clicking through to the page detail shows that the highest averages are the result of one really long load time of 100+ seconds (and most of the top entries have a "Page Load Sample" of only 1 or 2 visitors).
Are these just anomalies where the user lost connection or something? Is there any way to view the data without those anomalies?
Google Analytics Site Speed tracking utilizes the HTML5 Navigation Timing API. I wrote a fairly detailed answer on it on StackOverflow.
Specifically, it uses these two attributes on browsers that expose the performance.timing object:
fetchStart attribute
If the new resource is to be fetched using HTTP GET or equivalent, fetchStart must return the time immediately before the user agent starts checking any relevant application caches. Otherwise, it must return the time when the user agent starts fetching the resource.
loadEventStart attribute
This attribute must return the time immediately before the load event of the current document is fired. It must return zero when the load event is not fired yet.
That latter event corresponds to the window load event (window.onload). Though this is an important event, it is not the same as the moment your page becomes usable. Your page could very well be functioning while the onload event is held up by a particularly slow resource.
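The duration recorded from those two timestamps can be sketched as a small helper. The function name gaPageLoadMs is my own, and the timing values in the example are made up for illustration; in a browser you would pass performance.timing itself:

// Hypothetical helper: computes the interval Site Speed tracking is based on,
// given a Navigation Timing-style object (performance.timing in a browser).
function gaPageLoadMs(timing) {
  // Per the spec, loadEventStart is 0 until the load event fires,
  // so there is nothing to report yet in that case.
  if (!timing.loadEventStart) {
    return null;
  }
  return timing.loadEventStart - timing.fetchStart;
}

// Example with fabricated epoch-millisecond timestamps:
var sample = { fetchStart: 1700000000000, loadEventStart: 1700000002500 };
console.log(gaPageLoadMs(sample)); // 2500 (ms)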
Does your site include external assets? Advertising? Many social plugins? It's possible that they're delaying your window load event.
It's important to keep in mind that the sampling rate for this feature is by default very low. It's configured to try to sample 10% of visits, but in practice, depending on your level of support, it can be anything between 2%-4% of traffic. So, single outliers can have very significant effects on the quality of the data.
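If the small sample is the concern, the classic (ga.js) API lets you request a higher sampling rate with _setSiteSpeedSampleRate (a percentage of visits; Google caps the effective rate on high-traffic sites). A minimal sketch, with a placeholder property ID:

// _gaq is the classic Google Analytics command queue; commands pushed
// before ga.js finishes loading are replayed once it does.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-1']);    // placeholder property ID
_gaq.push(['_setSiteSpeedSampleRate', 50]);  // request sampling of 50% of visits
_gaq.push(['_trackPageview']);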
You can get a better idea of how reliable the data is by drilling down into a particular page's sampling data to see the distribution of load times.
If you're interested in a different data point for your performance, you could use events to construct a custom performance tracker. (It would, however, ruin your bounce rate, as an event counts as an interaction.) For example, maybe the DOM Ready event is a better barometer of when your site becomes usable:
if (window.performance && performance.timing) {
    // Time from the start of the fetch to the start of DOMContentLoaded
    var ms = performance.timing.domContentLoadedEventStart - performance.timing.fetchStart;
    _gaq.push(['_trackEvent', 'Performance', 'Time to DOM Ready', '', ms]);
}
(Anecdotally, yes, it's very common. Check out some of these times from one of my accounts):
They could be someone on dial-up who took ages to load your page, or someone on a congested connection.
If you can find out when they visited, you can exclude that day from your figures. Or, if you can spot some other characteristic, such as their country or network, you can create a custom segment that excludes those visitors.
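To see how much a single slow visit distorts the report, you can compare a plain average against one with the extremes trimmed, which mimics what excluding those visitors via a segment would do. A sketch with made-up load times:

// Mean of an array of page load times (seconds).
function mean(times) {
  return times.reduce(function (sum, t) { return sum + t; }, 0) / times.length;
}

// Trimmed mean: drop the highest `n` values before averaging,
// similar in effect to excluding outlier visits from a segment.
function trimmedMean(times, n) {
  var sorted = times.slice().sort(function (a, b) { return a - b; });
  return mean(sorted.slice(0, sorted.length - n));
}

// Fabricated sample: four normal loads and one 100+ second outlier.
var times = [2, 3, 2, 3, 110];
console.log(mean(times));           // 24
console.log(trimmedMean(times, 1)); // 2.5

One 110-second visit pulls the "Average Load Time" from about 2.5 seconds up to 24, which matches the kind of skew described in the question.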