Do many broken internal links negatively impact site quality with regard to rankings? Will a website with a high ratio of broken internal links be considered low quality, and therefore not rank well? Does anyone have an official reference confirming this?
Broken internal links are "worth fixing" according to Google's John Mueller. He works on Webmaster Tools and Sitemaps, and he had the following to say about 404 errors that appear in Webmaster Tools:
HELP! MY SITE HAS 939 CRAWL ERRORS!!1
I see this kind of question several times a week; you’re not alone - many websites have crawl errors.
404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html
In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How can you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)
What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698
You don’t need to fix crawl errors in Webmaster Tools. The “mark as fixed” feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don’t need it. support.google.com/webmasters/bin/answer.py?answer=2467403
We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won’t find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html
There’s no need to “fix” crawl errors on your website. Finding 404s is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, you shouldn’t robots.txt disallow those URLs -- all of these things make it harder for us to recognize your site’s structure and process it properly. We call these “soft 404” errors. support.google.com/webmasters/bin/answer.py?answer=181708
Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that’s something you should take action on immediately. If Googlebot can’t crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.
The comment from Martino Mosna that John edited in is:
...a lot of 404 may suggest a structural issue, usually common in badly designed CMS, that can effectively harm rankings. Of course it depends on the URLs reported...
So it appears that John does agree that many internal broken links could hurt your rankings.
Google doesn't want to send visitors to sites that don't have a good user experience, and it may even algorithmically penalize sites with many internal broken links.
At the very least, users may go back to the Google results when they encounter error pages on your site. Google pays attention to when users click back to the search results, and many users doing that can certainly hurt your rankings.
As closetnoc states in the comments, a few broken links are not likely to hurt you. Most sites have some. I'd only worry if you have many, or if users run into them frequently.
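To make the quoted advice concrete, here's a minimal sketch using Python's standard http.server; the paths and the moved-URL map are hypothetical. A moved page gets a real 301 pointing at its equivalent new URL, and anything unknown gets a real 404 status rather than a "soft 404" (a 200 response that merely displays an error page):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical map of retired URLs to their equivalent replacements.
    MOVED = {"/old-article": "/new-article"}


    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path in MOVED:
                # Moved content: permanent redirect to the equivalent new URL.
                self.send_response(301)
                self.send_header("Location", MOVED[self.path])
                self.end_headers()
            elif self.path == "/new-article":
                body = b"<html><body>The content lives here now.</body></html>"
                self.send_response(200)
                self.send_header("Content-Type", "text/html")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                # Gone content: a real 404 status, not a "soft 404"
                # (a 200 response that merely shows an error page).
                self.send_error(404)


    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()

Serving honest status codes is what lets crawlers tell genuinely gone content apart from your site's real structure.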
Google offers supporting guidance in its technical guidelines: support.google.com/webmasters/answer/35769#technical_guidelines
It states:
Every page should be reachable from at least one static text link.
And because so many people use Google, not following its guidelines is a good way to lose search ranking. On top of that, Google prefers webmasters to build sites that offer a good user experience. Pages littered with advertisements (especially pop-ups) create a bad experience, and broken links create an even worse one, especially when following that link is required to reach a commonly used section of the site.
Unless you set out to build a site for robots only, broken links on popular pages will likely earn you complaints from your visitors.
I'd recommend using the webmaster tools from either Bing or Google. Submit a sitemap and let the service crawl your entire website; within a few days it will report the broken links it found, so you know where to start. If you'd rather check right away yourself, a rough script like the one below can do a first pass.
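Here's a rough, standard-library-only sketch of such a link checker; the start URL is a placeholder for your own site. It crawls internal pages breadth-first up to a page cap and prints each broken link together with the page linking to it:

    import urllib.error
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    START_URL = "https://example.com/"  # placeholder: your site's root


    class LinkExtractor(HTMLParser):
        """Collects href values from <a> tags."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)


    def check_site(start_url, max_pages=200):
        host = urllib.parse.urlparse(start_url).netloc
        seen, broken = set(), []
        queue = [(start_url, "(start)")]  # (url, page it was found on)
        while queue and len(seen) < max_pages:
            url, found_on = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    if "text/html" not in resp.headers.get("Content-Type", ""):
                        continue  # don't parse images, PDFs, etc.
                    html = resp.read().decode("utf-8", errors="replace")
            except urllib.error.HTTPError as err:
                broken.append((url, found_on, err.code))
                continue
            except urllib.error.URLError:
                broken.append((url, found_on, "unreachable"))
                continue
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urllib.parse.urljoin(url, href).split("#")[0]
                # Only follow internal links; external ones are out of scope.
                if urllib.parse.urlparse(absolute).netloc == host:
                    queue.append((absolute, url))
        return broken


    if __name__ == "__main__":
        for url, found_on, status in check_site(START_URL):
            print(f"{status}: {url} (linked from {found_on})")

A real checker should also respect robots.txt and throttle its requests; this sketch omits both for brevity.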