Google reports a sudden increase in 404 errors from crawling pages

My server went down for a couple of days (2-3 August). 15 days later, Google Webmaster Tools notified me of an increase in 404 errors.
The problem is that Google reports 70,000 error pages, while it has only ever indexed a maximum of 7,000 pages, as the index status shows:
While this is the crawl error page:
How can I find out which pages Google cannot find? I can see a maximum of 1,000 errors, but those 1,000 errors are old and were already present before. And how is it possible that Google cannot find 70,000 pages if it has only ever indexed a maximum of 7,000?
Note: I have run the Xenu link checker to look for internal broken links and none were found (a rough sketch of an equivalent check is included after the next note).
Note 2: Coinciding with the spike in errors (21 August), there was a spike in crawled pages too:
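As a rough illustration of the internal-link check mentioned in the first note, here is a minimal Python sketch; the start URL is a placeholder and this is only a simplified stand-in for what a tool like Xenu does:

    # Minimal internal broken-link check (sketch). Follows same-host links
    # starting from START_URL and reports URLs that fail to load.
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    START_URL = "https://www.example.com/"  # placeholder: replace with your site

    class LinkExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def check_site(start_url, limit=200):
        host = urllib.parse.urlparse(start_url).netloc
        seen, queue, broken = set(), [start_url], []
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    charset = resp.headers.get_content_charset() or "utf-8"
                    html = resp.read().decode(charset, "ignore")
            except Exception:
                broken.append(url)  # 404s, timeouts, server errors, ...
                continue
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urllib.parse.urljoin(url, href).split("#")[0]
                if urllib.parse.urlparse(absolute).netloc == host:
                    queue.append(absolute)
        return broken

    if __name__ == "__main__":
        for url in check_site(START_URL):
            print("broken:", url)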
This can happen when you have a lot of URL parameters, which usually results in a large number of page URLs being generated. Googlebot automatically ensures that not all of these are indexed; however, when your site went down, all of them could have shown up as 404 errors.
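As a rough illustration of how a handful of independent parameters can multiply the number of crawlable URLs far beyond the number of indexed pages (all parameter names and counts below are made-up examples, not taken from your site):

    # Each parameter combination produces a distinct crawlable URL,
    # even though they all map to the same underlying page.
    from itertools import islice, product

    base_pages = 70  # assumed number of "real" pages, purely illustrative
    params = {
        "sort": ["price", "name", "date"],
        "page": [str(n) for n in range(1, 21)],  # 20 pagination values
        "view": ["list", "grid"],
        "lang": ["en", "it", "de", "fr"],
    }

    combinations = 1
    for values in params.values():
        combinations *= len(values)

    print(combinations, "parameter combinations per page")      # 3*20*2*4 = 480
    print(base_pages * combinations, "distinct crawlable URLs")  # 70*480 = 33,600

    # A few of the generated variants for one hypothetical page:
    for combo in islice(product(*params.values()), 3):
        query = "&".join(f"{k}={v}" for k, v in zip(params, combo))
        print(f"https://www.example.com/category-1?{query}")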
How does one analyse and clear these?
There is a priority column in the crawl errors report. Sort it from priority 1 onwards and start fixing from the top. We have usually seen that fixing the first error resolves many other errors as well; in your case, hundreds to thousands of errors should disappear once you fix the first one (since you have 70,000 errors).
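If the report itself is too large to work through by hand, one way to triage it is to export the crawl errors and group the 404 URLs by path and query-parameter names, so thousands of parameterized variants collapse into a few buckets. A minimal Python sketch, assuming you have downloaded the errors to a CSV file with a "URL" column (the file name is made up):

    # Group exported 404 URLs into buckets by first path segment and the
    # set of query parameter names; the biggest buckets point at the fix
    # that will clear the most errors at once.
    import csv
    from collections import Counter
    from urllib.parse import urlparse

    groups = Counter()
    with open("crawl_errors.csv", newline="") as f:
        for row in csv.DictReader(f):
            parsed = urlparse(row["URL"])
            segment = parsed.path.strip("/").split("/")[0] or "/"
            param_names = ",".join(sorted(
                p.split("=")[0] for p in parsed.query.split("&") if p))
            groups[(segment, param_names)] += 1

    for (segment, param_names), count in groups.most_common(10):
        print(f"{count:6d}  /{segment}  params: {param_names or '-'}")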
We faced a similar issue recently and this helped. Let us know how it goes.