Mobile app version of vmapp.org
Kaufman445

: Can robots.txt fix all soft 404s?

@Kaufman445

Posted in: #Soft404

I have many soft 404s in Google Webmaster Tools for pages that no longer exist, so I am unable to insert <meta name="robots" content="noindex, nofollow"> into those pages. I've been searching for a while but haven't found any useful clues.

There are about 100 soft-404 URLs; redirecting them all one by one seems a bit silly, as it would cost me too much time.

If I just add those URLs to robots.txt like below:

User-agent: *
Disallow: /mysite.asp
Disallow: /mysite-more.html


will this fix all the soft 404s for good? Or is there a way to change all the soft 404s into hard 404s? Please give me some suggestions. Many thanks.
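As an aside, the ~100 Disallow lines would not need to be typed one by one; a short script can generate them from a URL list. This is only a sketch: the two paths are the example ones from above, and in practice the list would come from wherever the dead URLs are recorded (e.g. an export from Webmaster Tools).

```python
# Paths of the dead pages. These two are just the example paths from the
# question; a real list would hold all ~100 soft-404 URLs.
dead_paths = ["/mysite.asp", "/mysite-more.html"]

# Build the robots.txt contents: one header line plus one Disallow per path.
lines = ["User-agent: *"] + [f"Disallow: {path}" for path in dead_paths]
robots_txt = "\n".join(lines)
print(robots_txt)
```

Note that robots.txt only blocks crawling; it does not make a URL return a 404 status.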


1 Comment


@Nimeshi995

Wow. I think I got what you are saying, but there are some missing pieces, so please bear with me.

Soft 404 errors are often caused by something really simple, like having the phrase "not found" in your page content. That drives me nuts! It happens to me all the time. It can only occur when Google actually receives a page: likely one whose body says "not found" but which is served without an actual 404 response header. Something is interrupting the normal 404 error process. We know this because you tell us that the pages do not exist.

I am not sure how a non-existent page can produce a soft 404 unless it is being redirected, or it triggers a custom 404 error page that is not working properly (for example, one served with a 200 status). If that is the case, remove the redirect or the custom 404 page and let a real 404 happen. It may take about 30 days or so for Google to stop looking for these pages, but it will clear up in time.
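The distinction I'm describing can be sketched as a toy check. The error-phrase list below is my guess at the sort of wording that trips the detector, not Google's actual heuristic:

```python
def classify_response(status, body):
    """Roughly mimic how a crawler might flag a soft 404: a page served
    with a 200 status whose content looks like an error message. The
    phrase list is an assumption, not Google's real heuristic."""
    error_phrases = ("not found", "page does not exist", "no longer available")
    if status in (404, 410):
        return "hard 404"   # real error status: what you want for dead pages
    if status == 200 and any(p in body.lower() for p in error_phrases):
        return "soft 404"   # "OK" status but error-looking content
    return "ok"

print(classify_response(404, "Not Found"))              # hard 404
print(classify_response(200, "Sorry, page not found"))  # soft 404
```

Fetching a few of your 100 URLs and looking at the actual status line in the response will tell you which case you are in.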
