When you move a site via a 301 redirect, should you set up a robots.txt disallowing robots from crawling the old address?
A site I am working on moved a subdomain to another subdomain via a 301 redirect. However, when checking the robots.txt of the old subdomain, I found that it disallows search engine crawlers from crawling it.
Is this the right move? I believe not, because the crawlers won't be able to crawl the old site to see the 301 status codes.
I would keep the robots.txt as is, but make sure that the old and new pages have the appropriate canonical tags. This will help the search engines make sense of what has gone on.
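For reference, a canonical tag is a `<link>` element in the page's `<head>` pointing at the preferred URL. A minimal sketch, with hypothetical subdomains, of what the new page might carry:

```html
<!-- On the new page: a self-referencing canonical (URLs are hypothetical) -->
<head>
  <link rel="canonical" href="https://new.example.com/page" />
</head>
```

Note that if the old subdomain is blocked by robots.txt, crawlers cannot fetch its pages to read any canonical tag placed there, so in practice the tag only helps on pages the crawlers can actually reach.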
Your suspicion is right, for exactly the reason you mentioned: if you disallow robots from accessing the old site, they won't be able to see the 301 redirect.
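You can see the effect of a blanket disallow with Python's standard-library robots.txt parser. This is a minimal sketch with hypothetical URLs: a `Disallow: /` rule tells a well-behaved crawler not to fetch any URL on the old subdomain, including the redirecting ones.

```python
from urllib import robotparser

# robots.txt that blocks everything, as found on the old subdomain.
blocking = robotparser.RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /"])

# The crawler may not fetch the old URL, so it never sees the 301.
print(blocking.can_fetch("Googlebot", "https://old.example.com/page"))  # False

# robots.txt with an empty Disallow, i.e. everything is crawlable.
open_rules = robotparser.RobotFileParser()
open_rules.parse(["User-agent: *", "Disallow:"])

# Now the crawler can request the old URL and follow the redirect.
print(open_rules.can_fetch("Googlebot", "https://old.example.com/page"))  # True
```

In other words, the robots.txt rule is evaluated before the request is ever made, so the redirect's status code is never observed by a compliant crawler.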
I believe no because the crawlers won't be able to crawl the old site to see the 301 status codes.
Yes, exactly - this is not the "right move". If you implement a 301 redirect, presumably to preserve SEO, then blocking the URLs that are being redirected is going to prevent the search engines from seeing the redirect and indexing the new content.
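For completeness, a minimal sketch of how such a subdomain-wide 301 is often configured in nginx (the subdomain names are hypothetical):

```nginx
# Permanently redirect every URL on the old subdomain
# to the same path on the new subdomain.
server {
    server_name old.example.com;
    return 301 https://new.example.com$request_uri;
}
```

With this in place, the old subdomain's robots.txt should not disallow crawling, so that crawlers can request the old URLs, receive the 301 responses, and transfer indexing to the new addresses.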