Radia820

Google Webmaster Tools showing 170 "not found" 404 errors

@Radia820

Posted in: #CrawlErrors #GoogleSearchConsole #Seo

I have a website which is 2 years old and it is simple PHP pages.

I had set up Google Analytics and Google Webmaster Tools for it.

Just last week I removed that PHP website and installed a new WordPress website. I created pages with the old site's content and also added a few new images.

The issue is: when I logged into my Google Webmaster Tools account, there were 170 "404 not found" errors!

How do I resolve the error messages saying 404 not found?

How can I remove links to pages that no longer exist from Google Webmaster Tools?

Will it affect my website traffic?




2 Comments


 

@Heady270

Here is what Google's John Mueller (who works on Webmaster Tools and Sitemaps) has to say about 404 errors that appear in Webmaster Tools:


HELP! MY SITE HAS 939 CRAWL ERRORS!!1

I see this kind of question several times a week; you’re not alone - many websites have crawl errors.


1. 404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html

2. In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)

3. What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698

4. You don’t need to fix crawl errors in Webmaster Tools. The “mark as fixed” feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don’t need it. support.google.com/webmasters/bin/answer.py?answer=2467403

5. We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won’t find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html

6. There’s no need to “fix” crawl errors on your website. Finding 404’s is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, you shouldn’t robots.txt disallow those URLs -- all of these things make it harder for us to recognize your site’s structure and process it properly. We call these “soft 404” errors. support.google.com/webmasters/bin/answer.py?answer=181708

7. Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that’s something you should take action on immediately. If Googlebot can’t crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.



 

@Gloria169

The 170 "not found" (404) errors are most likely appearing because Google can no longer find the old PHP pages at their original URLs.

If your new WordPress pages and content match your previous ones, then create 301 redirects in your web server configuration from the old URLs to the new ones to let Google know that they moved.
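For example, if the site runs on Apache (as most WordPress installs do), the redirects can go in the site's root .htaccess file. The paths below are placeholders; substitute your actual old PHP URLs and new WordPress permalinks:

```apache
# Map each old PHP page to its new WordPress permalink (example paths)
Redirect 301 /about.php /about/
Redirect 301 /contact.php /contact/

# Alternatively, if every old slug carried over unchanged, one
# mod_rewrite pattern can cover all .php pages at once:
# RewriteEngine On
# RewriteRule ^([a-z0-9-]+)\.php$ /$1/ [R=301,L]
```

Use one approach or the other for a given URL, not both, so mod_alias and mod_rewrite don't compete over the same request.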

If you just wish to add the new URLs and remove the old ones, create a sitemap with the new URLs and submit it to Google. Then see remove your own content from Google search results to remove the old URLs. To remove an entire directory, you can use this under Google Webmaster Tools: Remove your entire site or a directory.
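A minimal sitemap for the new URLs looks like the sketch below (example.com, the paths, and the dates are placeholders); in practice a WordPress SEO plugin can generate and maintain this file for you:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/about/</loc>
    <lastmod>2014-01-15</lastmod>
  </url>
  <url>
    <loc>http://example.com/contact/</loc>
  </url>
</urlset>
```

Upload it to the site root (e.g. /sitemap.xml) and submit that URL under Crawl > Sitemaps in Webmaster Tools.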

In that case, you may also want to block search engine robots from trying to crawl the older URLs by disallowing them in your robots.txt - see this and this for more on how to do so.
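A robots.txt in the site root would then look something like this (the paths are placeholders for your old PHP pages):

```text
User-agent: *
Disallow: /about.php
Disallow: /contact.php
```

Note this only applies to URLs you are removing outright: do not disallow URLs you are 301-redirecting, or crawlers will never see the redirect.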

As for website traffic: search engine users are likely hitting the same 404 errors, as is anyone coming from a site that links to the old URLs, so those visitors may bounce (leave) from your site. Redirecting to matching content with 301 redirects should help retain them.


