
How to fix thousands of 404s?

@Radia820

Posted in: #GoogleSearchConsole #RobotsTxt

The business I work for recently migrated our e-commerce site from Celerant to Magento. Despite using a robots.txt file and a 301 redirect extension, Webmaster Tools is reporting that our "not found" errors are increasing by the day, now over 200,000, all from URLs on our old site.

I've tried 301 redirects, which don't seem to be working, and then tried blocking the URLs in robots.txt with Disallow rules, including wildcards, but the errors continue to increase. Any advice would be greatly appreciated. Here is an example of one of our 404 URLs.

This is how we tried to block it in our robots.txt. Is this correct?

Originally we used:

Disallow: /category.cfm


A few days ago we decided to try:

Disallow: /*category
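
For context, a Disallow line only takes effect inside a User-agent group, so the complete file (a minimal sketch, assuming the rule should apply to all crawlers) would look like this:

User-agent: *
Disallow: /category.cfm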


1 Comment


@Heady270

Your original robots.txt syntax was correct:

Disallow: /category.cfm

Webmaster Tools processes URLs that Googlebot may have crawled some time ago, so your 404 errors can keep growing for a while even after you stop allowing Googlebot to recrawl those URLs.

301 redirects really are the correct way to handle this. In the example you gave of the paris brand page, your on-site search engine seems to handle it well: boardshop.bcsurf.com/search?w=paris I'd recommend 301 redirecting that brand page to the corresponding search URL.
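
For example, on an Apache server with mod_rewrite enabled, a rule along these lines in .htaccess could map the old Celerant URLs onto your search page. This is only a sketch: the "category" query parameter name is a guess and would need to match whatever the old URLs actually use.

# Hypothetical mapping: pull the old "category" query value and
# 301-redirect to the on-site search for that term.
RewriteEngine On
RewriteCond %{QUERY_STRING} (?:^|&)category=([^&]+)
RewriteRule ^category\.cfm$ /search?w=%1 [R=301,L]

With a rule like that in place, a request for category.cfm?category=paris would return a 301 pointing to /search?w=paris.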
