Mobile app version of vmapp.org

Will Google remove links blocked by robots.txt from the soft 404 list?

@LarsenBagley505

Posted in: #CrawlErrors #Googlebot #Indexing #RobotsTxt #Soft404

I have updated a shop system to a newer version. I now get a huge number of soft 404 errors in Webmaster Tools because of links that were indexed in the old version of the shop.

I could lock Google out of these links by adding entries to robots.txt. If I do that, will the already indexed links be removed from the soft 404 list, or will they stay on that list forever because Google can no longer verify their status?
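For illustration, a robots.txt that blocks crawling of the old shop URLs might look like the following (the paths here are hypothetical, not taken from the question):

```
# Block crawling of the old shop's URL structure (hypothetical paths)
User-agent: *
Disallow: /shop/index.php
Disallow: /shop/product.php
```

Note that Disallow only stops crawling; URLs that are already indexed can remain in the index, and Google can no longer fetch them to confirm they return a 404, which is exactly why entries blocked this way may never clear from the report.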




1 Comment


 

@Angela700

I wouldn't worry too much, because 404 errors are not penalized. They simply represent pages that are no longer available. Once the old content is removed and the server returns a proper 404, the URLs will clear from search on their own. I generally suggest not blocking Google with robots.txt for this.

As you will see below, Google recommends 404 pages and the noindex tag as opposed to robots.txt blocks:

"Make removal permanent"

The Remove URLs tool is only a temporary removal. To remove content or a URL from Google search permanently you must take one or more of the following additional actions:

- Remove or update the actual content from your site (images, pages, directories) and make sure that your web server returns either a 404 (Not Found) or 410 (Gone) HTTP status code. Non-HTML files (like PDFs) should be completely removed from your server. (Learn more about HTTP status codes)
- Block access to the content, for example by requiring a password.
- Indicate that the page should not be indexed using the noindex meta tag. This is less secure than the other methods.
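As a sketch of the first option, assuming an Apache server and hypothetical old shop paths, the removed URLs could be made to return 410 (Gone) via .htaccess:

```
# Apache .htaccess sketch (hypothetical paths):
# tell crawlers the old shop URLs are gone for good
Redirect gone /shop/old-product.php
Redirect gone /shop/old-category.php
```

Alternatively, for pages that must stay reachable by users, a `<meta name="robots" content="noindex">` tag in the page `<head>` covers the third option; Google has to be able to crawl the page to see the tag, so such pages should not also be blocked in robots.txt.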


