How to interpret the number of URL errors in Google Webmaster Tools
Recently Google made some changes to Webmaster Tools, which are explained here: http://googlewebmastercentral.blogspot.com/2012/03/crawl-errors-next-generation.html
One thing I could not find out is how to interpret the number of errors over time. At the end of February we migrated our website and did not implement redirect rules for some pages (quite a few, actually). Here is what we are getting from the Crawl errors report:
What I don't know is whether the number of errors is cumulative over time (i.e. if Google's bots crawl your website on two different days and find one separate issue on each day, whether they will report one error on each day, or one error on the first day and two on the second).
Based on the Crawl stats, we can see that the number of requests made by Google's bots is not increasing:
Therefore I believe the number of errors reported is cumulative: an error detected on one day is carried over and reported on subsequent days until the underlying problem is fixed and the page is crawled again (or until you manually mark the error as fixed). If Google is not making more requests to the website, it cannot check new pages and re-check old pages at the same time.
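The reasoning above can be sketched as a small simulation (a hypothetical model for illustration, not Google's actual logic): errors accumulate from day to day, and an error only drops off the report once the offending URL is recrawled successfully or marked as fixed.

```python
# Hypothetical model of cumulative crawl-error reporting (not Google's real code).
# An erroring URL stays on the report every day until it is recrawled
# successfully or manually marked as fixed.

def daily_error_counts(crawl_days, fixed=frozenset()):
    """crawl_days: a list of sets, each holding the URLs that returned an
    error on that day's crawl. Returns the error count reported per day,
    assuming errors persist until resolved."""
    outstanding = set()
    counts = []
    for errors_found in crawl_days:
        outstanding |= errors_found   # newly discovered errors accumulate
        outstanding -= fixed          # fixed/redirected URLs drop off the report
        counts.append(len(outstanding))
    return counts
```

With one new broken URL found on each of two days, this model reports 1 error on day one and 2 on day two, which is the cumulative behaviour described above.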
Q: Am I interpreting the number of errors correctly?
Yes, you are interpreting the numbers correctly. A crawl error will continue to be reported until the page is returned again, a redirect is set up, or Google stops attempting to crawl the page. So as Google finds more pages with crawl errors, you will see your total number of crawl errors rise.
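For the "redirect is set up" case, a minimal sketch of a redirect map (with hypothetical paths, not the asker's actual URLs) might look like this; once an old URL answers with a 301 instead of a 404, the corresponding crawl error clears on a subsequent crawl:

```python
# Hypothetical redirect map for a migrated site.
# Old URLs with a rule answer 301 and stop appearing as crawl errors;
# old URLs without a rule keep returning 404 and stay on the report.

REDIRECTS = {
    "/old/products.html": "/products",
    "/old/about.html": "/about-us",
}

def resolve(path):
    """Return (status, location) for an old URL: a 301 redirect if a
    rule exists, otherwise a 404 (which Google reports as a crawl error)."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None
```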