
@Tiffany637

Posted in: #Authentication #GoogleSearchConsole #RubyOnRails #Soft404

Over the last month, Google Webmaster Tools has reported an increasing number of "soft 404" errors for pages that actually work well for users.

Configuration (possibly irrelevant):


I have a website built with Rails 3.1
Authentication is handled by the Devise gem


Problem:


On en.bemyboat.com/yacht-charter/9965-sailboat-beneteau-oceanis-43, click "Ask a Boat request" (a simple form that issues a GET to en.bemyboat.com/boat_requests/new/9965). You are redirected with HTTP status 302 to the sign-in page.
After you sign in successfully, you are sent back to the boat request page.


Google tells me that the link behind "Ask a Boat request" returns a soft 404.

I can't change this form to POST (which would solve the problem) because users need to be redirected back to the page automatically after sign-in, and the Devise gem memorizes only the GET URL.
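
That constraint can be sketched in plain Ruby (a simplified model using a hash in place of the Rails session; the helper names here are illustrative, not Devise's actual API):

```ruby
# Simplified model of why the form must stay a GET: only GET
# request URLs are stored for replay after sign-in; a POST body
# could not safely be re-submitted.
def store_location(session, method, url)
  session[:return_to] = url if method == :get
end

def after_sign_in_path(session, default = "/")
  session.delete(:return_to) || default
end

session = {}
store_location(session, :post, "/boat_requests")          # ignored
store_location(session, :get, "/boat_requests/new/9965")  # remembered
puts after_sign_in_path(session)  # => /boat_requests/new/9965
```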

To simplify, the question is:
How can I protect a private page with authentication, reached with a simple GET, without being penalized by Google with a "soft 404"?




2 Comments


@BetL925

Soft 404s are not a "penalty". Google reports them to you so that you can evaluate whether or not they are actual problems. Google's John Mueller has this to say about 404 errors:


404 errors on invalid URLs do not harm your site’s indexing or ranking in any way. It doesn’t matter if there are 100 or 10 million, they won’t harm your site’s ranking. googlewebmastercentral.blogspot.ch/2011/05/do-404s-hurt-my-site.html

So there is no need to fix them if they are not actual problems.



Having said that, you don't want bots (including Googlebot) to submit your forms. You should block bots from crawling your form submission URLs using robots.txt:

User-Agent: *
Disallow: /boat_requests/
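
The rule above is a simple path-prefix match. A minimal sketch of how a crawler applies it (simplified; it ignores wildcards and per-agent groups from the full robots exclusion standard):

```ruby
# Sketch of robots.txt Disallow matching: a path is blocked when
# it starts with any disallowed prefix.
DISALLOWED_PREFIXES = ["/boat_requests/"].freeze

def crawl_allowed?(path)
  DISALLOWED_PREFIXES.none? { |prefix| path.start_with?(prefix) }
end

puts crawl_allowed?("/boat_requests/new/9965")                          # => false
puts crawl_allowed?("/yacht-charter/9965-sailboat-beneteau-oceanis-43") # => true
```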




@Angela700

You could try applying the nofollow link relationship to the form's action URL to prevent Google from even "following" the URL. If the page can't be viewed unless you're logged in, there is no reason for a search engine to visit it anyway.

You could also apply noindex to en.bemyboat.com/boat_requests/*, for example.
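
One way to do that is via the standard X-Robots-Tag response header, sketched here with a plain hash standing in for the Rails response headers (in a real app this logic would live in a controller callback):

```ruby
# Sketch: attach "noindex, nofollow" to responses for the
# authentication-gated boat request URLs.
def apply_noindex(headers, path)
  headers["X-Robots-Tag"] = "noindex, nofollow" if path.start_with?("/boat_requests/")
  headers
end

puts apply_noindex({}, "/boat_requests/new/9965")["X-Robots-Tag"]  # => noindex, nofollow
```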

More information is available in Google's documentation on nofollow and how Google treats it.
