
How do I stop Soft 404 errors from piling up for a "No Results" page?

@Heady270

Posted in: #GoogleSearchConsole #Noindex #Soft404

Recently, I noticed that I have a growing list of Soft 404 errors in Google Webmaster Tools. They are all for dynamically generated search result pages that report "No matches found".

I do understand what Google means by Soft 404 and why they are reporting it for these pages. So I added <meta name="robots" content="noindex"> to these pages.
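For reference, here is roughly how the tag sits in the head of each no-results page (a minimal sketch, not my actual template; the surrounding markup is just an assumed example):

    <!DOCTYPE html>
    <html>
    <head>
        <title>Search results</title>
        <!-- noindex: don't index this page; its links are still followed by default -->
        <meta name="robots" content="noindex">
    </head>
    <body>
        <p>No matches found.</p>
    </body>
    </html>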

However, Google is still reporting new Soft 404 errors for pages that are using the noindex meta tag.

Why does Google report any error for a page I told them not to index?

The problem is that with all these unwanted errors, I can't see if there are any real problems that need to be fixed.

Some have said that these pages should return a 404 status code. But that just shifts the problem over to the 404 errors tab. Besides, Google returns status code 200 for their no results page.
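For what it's worth, that suggestion amounts to something like this in the PHP that renders the results page, where $results is just a placeholder for whatever the search query returned:

    <?php
    // $results stands in for whatever the search query returned
    if (count($results) === 0) {
        // send a real 404 status instead of 200, before any output is rendered
        http_response_code(404);
    }
    // ...then render the "No matches found" page as usual...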

I don't want to block access with robots.txt because I want the links on these pages followed and I want Google to see the noindex meta tag. Besides, there is no pattern that I could use to block these.

Google found these URLs in the first place because the content used to exist, but has since been deleted. I can't return a 410 status code, because my PHP code has no way of knowing the reason why no results were found.

Is there anything I can do to make it easier to see the real problems?


1 Comment


 

@BetL925

You should prevent Google from crawling your site search pages. Google doesn't want to crawl your site search at all. See Matt Cutts' blog post about the issue: "Search results in search results" (March 10, 2007). Google now actively penalizes sites that allow their site search results to be crawled and appear in Google's SERPs. By allowing Googlebot to crawl your search result pages, you are risking all of your Google referral traffic.

One favorite trick of a Google reviewer is to use your site search for spam terms such as "Viagra". When they see a crawlable page as the result (even if it says "no results for Viagra found"), they will apply a manual penalty against your site as a spam site.

You should disallow your site search in robots.txt. Just make sure that Googlebot can still crawl your content pages. You will then stop getting new soft 404 errors reported.
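For example, if the search pages lived under a /search path (an assumed pattern for illustration; substitute whatever your search URLs actually look like), the robots.txt rule would be:

    User-agent: *
    Disallow: /search

If your search runs off a query string instead (e.g. /find.php?q=...), you can disallow that script path, or use a wildcard pattern such as Disallow: /*?q= since Googlebot supports * and $ wildcards in robots.txt.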



A large number of 404 errors (even soft 404 errors) do not hurt your site's rankings. Google reports errors on any page they can find and crawl, whether or not you want it indexed, and whether or not you even link to it. They do this because the error reports are solely for your benefit and they feel like you should be fully informed.

Here is what Google's John Mueller has to say about it:



1) 404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html

2) In some cases, crawl errors may come from a legitimate structural issue within your website or CMS. How do you tell? Double-check the origin of the crawl error. If there's a broken link on your site, in your page's static HTML, then that's always worth fixing. (thanks +Martino Mosna)

3) What about the funky URLs that are “clearly broken?” When our algorithms like your site, they may try to find more great content on it, for example by trying to discover new URLs in JavaScript. If we try those “URLs” and find a 404, that’s great and expected. We just don’t want to miss anything important (insert overly-attached Googlebot meme here). support.google.com/webmasters/bin/answer.py?answer=1154698

4) You don’t need to fix crawl errors in Webmaster Tools. The “mark as fixed” feature is only to help you, if you want to keep track of your progress there; it does not change anything in our web-search pipeline, so feel free to ignore it if you don’t need it. support.google.com/webmasters/bin/answer.py?answer=2467403

5) We list crawl errors in Webmaster Tools by priority, which is based on several factors. If the first page of crawl errors is clearly irrelevant, you probably won’t find important crawl errors on further pages. googlewebmastercentral.blogspot.ch/2012/03/crawl-errors-next-generation.html

6) There’s no need to “fix” crawl errors on your website. Finding 404’s is normal and expected of a healthy, well-configured website. If you have an equivalent new URL, then redirecting to it is a good practice. Otherwise, you should not create fake content, you should not redirect to your homepage, you shouldn’t robots.txt disallow those URLs -- all of these things make it harder for us to recognize your site’s structure and process it properly. We call these “soft 404” errors. support.google.com/webmasters/bin/answer.py?answer=181708

7) Obviously - if these crawl errors are showing up for URLs that you care about, perhaps URLs in your Sitemap file, then that’s something you should take action on immediately. If Googlebot can’t crawl your important URLs, then they may get dropped from our search results, and users might not be able to access them either.
