Google Webmaster Tools reporting non-existent links
GWT is reporting 404 crawl errors on pages that have never been part of the website.
For example, it's reporting crawl errors on several non-existent pages such as mysite.net/blog/how-can-i-improve-seo-ranking. On the 'Linked from' tab, it lists several internal pages that don't actually contain links to this page. The 'Linked from' tab also lists the sitemap, which is updated once a week and has never contained links to these non-existent pages.
I am unable to figure out what exactly is causing this: is it something to do with the website itself, or is GWT identifying pages incorrectly? If these pages aren't listed in the sitemap or linked from any internal page, why is GWT reporting them? Does anyone know how to fix this?
2 Comments
Hi, this is due to a third-party site linking to a page on your domain that doesn't exist. Google has followed a link from another website to yours, but because the link on the other site is incorrect, Google gets a 404 error when it requests the non-existent page. This isn't your fault; it's usually caused by a typo on the other website, or by spam.
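If you suspect a broken inbound link, your server access logs can reveal where the requests come from: the Referer header on the 404 hits often points at the third-party page carrying the bad link. A minimal sketch, assuming logs in the common "combined" format (the sample log lines and URLs below are hypothetical):

```python
import re
from collections import Counter

def referrers_for_path(log_lines, path):
    """Tally the Referer header of requests that hit `path`,
    to see which external pages carry the broken link."""
    # Combined log format: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
    pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" \d+ \S+ "([^"]*)"')
    counts = Counter()
    for line in log_lines:
        m = pattern.search(line)
        if m and m.group(1) == path:
            counts[m.group(2)] += 1
    return counts

# Hypothetical combined-log lines:
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /blog/how-can-i-improve-seo-ranking HTTP/1.1" 404 209 "http://other-site.example/links" "Mozilla/5.0"',
    '5.6.7.8 - - [10/Oct/2023:13:56:01 +0000] "GET /blog/real-post HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(referrers_for_path(sample, "/blog/how-can-i-improve-seo-ranking"))
# → Counter({'http://other-site.example/links': 1})
```

In practice you would feed this the real log file (e.g. `open("/var/log/apache2/access.log")`, path depends on your server) rather than the sample lines.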
It is possible that your DNS entries were messed up or served incorrectly for a while. Then your site's domain name might have been pointing to a different server with a different site on it for a while. Google may have crawled that other site under your domain name and is now complaining about the page that it found during that time period.
Another possibility is that you are using shared hosting that was misconfigured for a time. Your host may have been showing the wrong content for your domain name.
It is also possible that your site has malware installed on it that is showing different content to Google. Use the "fetch as Google" feature in Google Webmaster tools to verify that Google is getting the content that you expect.
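One rough way to check for crawler-targeted cloaking is to fetch the same page twice, once with a normal browser User-Agent and once with Googlebot's, and diff the links in the two responses. A minimal sketch with hypothetical response bodies (in practice you would fetch them yourself, e.g. with `curl -A "Googlebot/2.1" ...`):

```python
import re

def injected_links(browser_html, crawler_html):
    """Return links that appear only in the page served to the crawler.
    Cloaking malware often injects spam links solely for Googlebot,
    which can then surface in Webmaster Tools as 'Linked from' pages."""
    href = re.compile(r'href="([^"]+)"')
    return set(href.findall(crawler_html)) - set(href.findall(browser_html))

# Hypothetical responses for a browser UA vs. Googlebot's UA:
browser_page = '<a href="/blog/real-post">Post</a>'
crawler_page = ('<a href="/blog/real-post">Post</a>'
                '<a href="/blog/how-can-i-improve-seo-ranking">spam</a>')
print(injected_links(browser_page, crawler_page))
# → {'/blog/how-can-i-improve-seo-ranking'}
```

An empty result for every page is consistent with "fetch as Google" showing the content you expect; a non-empty result is worth investigating.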
As long as your site is currently showing the correct content and links, the 404 errors reported in Google Webmaster Tools shouldn't hurt you. Here is what Google's John Mueller (who works on Webmaster Tools and Sitemaps) has to say about 404 errors that appear in Webmaster Tools:
HELP! MY SITE HAS 939 CRAWL ERRORS!!1
I see this kind of question several times a week; you’re not alone - many websites have crawl errors.
404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html
In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How can you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)
What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698
You don’t need to fix crawl errors in Webmaster Tools. The “mark as fixed” feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don’t need it. support.google.com/webmasters/bin/answer.py?answer=2467403
We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won’t find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html
There’s no need to “fix” crawl errors on your website. Finding 404’s is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, you shouldn’t robots.txt disallow those URLs -- all of these things make it harder for us to recognize your site’s structure and process it properly. We call these “soft 404” errors. support.google.com/webmasters/bin/answer.py?answer=181708
Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that’s something you should take action on immediately. If Googlebot can’t crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.
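Mueller's advice on redirects versus soft 404s boils down to a simple routing rule: 301-redirect a URL only when a real replacement exists, and otherwise return an honest 404 rather than faking content or bouncing to the homepage. A minimal sketch (the paths and redirect map are hypothetical):

```python
# Hypothetical map of retired URLs to their real replacements.
REDIRECTS = {
    "/blog/old-post": "/blog/new-post",
}

def respond(path):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the equivalent new URL
    return 404, None                 # honest 404: no soft-404, no homepage redirect

print(respond("/blog/old-post"))  # → (301, '/blog/new-post')
print(respond("/blog/gone"))      # → (404, None)
```

The same rule is usually expressed in server configuration (e.g. Apache `Redirect 301` or nginx `return 301`) rather than application code.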