
Google reports a sudden increase in 404 errors from crawling pages

@Connie744

Posted in: #CrawlErrors #GoogleIndex

My server went down for a couple of days (2-3 August).
15 days later, Google Webmaster Tools notified me of an increase in 404 errors.

The problem is that Google reports 70,000 error pages, while it has only ever indexed a maximum of 7,000 pages, as the Index Status report shows:

[screenshot: Index Status graph]

While this is the Crawl Errors page:

[screenshot: Crawl Errors report]

How can I find out which pages Google cannot find? I can see at most 1,000 errors, but those 1,000 errors are old and were present before too. And how is it possible that it cannot find 70,000 pages if it only ever indexed a maximum of 7,000?

Note: I have run the Xenu link checker to look for broken internal links, and none were found.
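Since the Crawl Errors report only shows the first 1,000 errors, one way to check the rest of your URLs yourself is to walk the sitemap and test each URL directly. A minimal sketch, assuming a standard XML sitemap (the sitemap URL below is a placeholder for your own):

```python
# Minimal sketch: fetch the sitemap and report every URL that now
# returns a 404. The sitemap URL is a placeholder - use your own.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(sitemap_url):
    """Return all <loc> entries from a standard XML sitemap."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return [loc.text for loc in tree.iterfind(".//sm:loc", NS)]

def status_of(url):
    """HTTP status for a HEAD request, including error statuses."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

for url in sitemap_urls(SITEMAP_URL):
    if status_of(url) == 404:
        print("404:", url)
```

This only covers URLs listed in the sitemap, which is itself informative: if those all return 200, the 70,000 reported errors must come from URLs outside it.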

Note 2: Coinciding with the spike in errors (21 August), there was a spike in crawled pages too:

[screenshot: crawl stats chart]




1 Comment


 

@Cooney921

This can happen when you have a lot of URL parameters, which usually result in a lot of pages being created. Googlebot automatically ensures that not all of these are indexed; however, when your site went down, all of them could have shown up as 404 errors.
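To put a number on it, here is a toy illustration of how a handful of independent query parameters on a single page template can yield far more crawlable URLs than you have indexed pages. All parameter names and values here are made up:

```python
# Toy illustration: four made-up query parameters on one page template.
# Every combination is a distinct URL that Googlebot can discover.
from itertools import product
from urllib.parse import urlencode

BASE = "https://www.example.com/products"  # placeholder
PARAMS = {
    "sort":  ["price", "name", "date", "rating"],         # 4 values
    "color": ["red", "blue", "green", "black", "white"],  # 5 values
    "size":  ["s", "m", "l", "xl"],                       # 4 values
    "page":  [str(n) for n in range(1, 51)],              # 50 values
}

combos = list(product(*PARAMS.values()))
print(len(combos), "distinct URLs")  # 4 * 5 * 4 * 50 = 4,000
print(BASE + "?" + urlencode(dict(zip(PARAMS, combos[0]))))
```

So a site with only 7,000 indexed pages can easily expose tens of thousands of parameterised URLs, and while the server was down, every one of them that Googlebot requested counted as a 404.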

How does one analyse and clear these?

There is a Priority column in the Crawl Errors report. Sort by priority, from 1 upward, and start fixing from the top. In our experience, fixing the first error tends to resolve many other errors along with it; in your case, hundreds or thousands of errors should disappear once you fix the first one (since you have 70,000 errors).
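Also, since Webmaster Tools only exposes the first 1,000 errors, your own access logs hold the complete list of paths Googlebot hit. A minimal sketch, assuming a standard Apache/nginx combined log format (the log path is a placeholder):

```python
# Minimal sketch: extract every path that returned a 404 to Googlebot
# from a combined-format access log. The log path is a placeholder.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder
# Matches the request line and a 404 status, e.g. "GET /foo HTTP/1.1" 404
PATTERN = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" 404 ')

hits = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:  # combined format includes the UA
            continue
        match = PATTERN.search(line)
        if match:
            hits[match.group(1)] += 1

print(sum(hits.values()), "total 404s,", len(hits), "distinct paths")
for path, count in hits.most_common(20):
    print(f"{count:6d}  {path}")
```

If most of those paths turn out to be parameterised variants of a few page templates, one fix can clear a whole batch at once, which is why working down the priority list is effective.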

We faced a similar issue recently and the priority-first approach helped. Let us know how it goes.


