
When you move a site via a 301 redirect, should you set up a robots.txt disallowing robots from crawling the old address?

@Annie201

Posted in: #Redirects #RobotsTxt #Seo

A site I am working on moved a subdomain to another subdomain via a 301 redirect. However, when checking the robots.txt of the old subdomain, I found that it disallows search engine crawlers from crawling it.

Is this the right move? I believe not, because the crawlers won't be able to crawl the old site to see the 301 status codes.
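
For reference, a robots.txt like the one below on the old subdomain blocks every well-behaved crawler from requesting any URL on that host. The hostname here is a placeholder, not the actual site:

    # robots.txt served from the old subdomain (e.g. old.example.com/robots.txt)
    User-agent: *
    Disallow: /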


3 Comments


 

@Megan663

I would keep the robots.txt as is, but make sure that the old and new pages have the appropriate canonical tags. This will help the search engines make sense of what has gone on.
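
For illustration only, a canonical tag on a page of the new subdomain would look something like the line below, placed in the page's <head>; the hostname and path are placeholders:

    <link rel="canonical" href="https://new.example.com/some-page/">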



 

@Welton855

Your suspicion is right, for exactly the reason you mentioned. If you disallow robots from accessing the old site, they won't be able to see the 301 redirect.
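
One quick way to confirm the 301 itself is in place is to request an old URL directly and look at the response headers, for example with curl (hostnames are placeholders). Whether crawlers are allowed to make that same request is then decided by the old subdomain's robots.txt:

    curl -I http://old.example.com/some-page

    # Expected response headers, roughly:
    # HTTP/1.1 301 Moved Permanently
    # Location: http://new.example.com/some-page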



 

@Holmes151

"I believe no because the crawlers won't be able to crawl the old site to see the 301 status codes."

Yes, exactly - this is not the "right move". If you implement a 301 redirect, presumably to preserve SEO, then blocking the URLs that are being redirected is going to prevent the search engines from seeing the redirect and indexing the new content.
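
In practice that means the old subdomain should not serve a robots.txt that disallows crawling; the old URLs just need to answer with the redirect. A minimal sketch of such a redirect, assuming an nginx server and placeholder hostnames:

    # nginx: send every request on the old subdomain to the same path on the new one
    server {
        server_name old.example.com;
        return 301 https://new.example.com$request_uri;
    }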


