
Can creating a duplicate site for testing and upgrades cause duplicate content?

@Angela700

Posted in: #Cms #DuplicateContent #Seo #WebHosting

We are considering upgrading our CMS to the latest version; the process may take anywhere from 2 to 4 weeks.

Since we need to work on the live server, here is what we have planned:


1. Take an exact replica of the site and move it to a test domain, but on the live server.

2. Block Google, Bing, and Yahoo in robots.txt:

User-agent: Google
Disallow: /

User-agent: Bing
Disallow: /

User-agent: Yahoo
Disallow: /

3. Upgrade the CMS and add functionality, test the entire structure, check URLs using Xenu, and then configure the site on the original domain.


The upgrade and the new tools may take 2 to 4 weeks.

Our concern: despite blocking Google, Bing, and Yahoo through these User-agent disallow rules, can the URLs of the test site still be crawled by the search engines? If yes, it may hurt the original site, since the test site would read as an entire duplicate. Or is there an alternative way around this?


3 Comments


@Cofer257

John is correct about adding authentication. One alternative is to restrict the site to specific IPs (e.g. the IP of your office), and serve a 404 or 403 error to others.
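
For instance, here is a minimal sketch of IP-based restriction under Apache 2.4; the document root and office IP below are placeholders, not anything from this thread, so substitute your own:

<Directory "/var/www/staging">
    # Allow only the office IP; everyone else receives a 403
    Require ip 203.0.113.10
</Directory>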

Also, with regard to robots.txt, you should block all bots, which is much simpler:

User-agent: *
Disallow: /


Even if you only block via robots.txt, there is no downside for SEO: the test site will not be crawled. If an individual page is linked to from elsewhere, its bare URL may still show up in the index, but the site's content will never be crawled.
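
If you also want those stray URLs kept out of the index entirely, one option beyond what this answer covers is an X-Robots-Tag response header on the staging vhost; a sketch assuming Apache with mod_headers enabled:

Header set X-Robots-Tag "noindex, nofollow"

Note that a crawler blocked by robots.txt never fetches pages and so never sees this header, so choose the mechanism that matches your goal (blocking crawling versus blocking indexing).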


@Correia994

You are describing a staging site. As you suggest, you replicate the site and its production environment to a separate location for development and testing.

Also, as you have mentioned, you may restrict access to the staging site with User-agent filtering and robots.txt. I recommend additionally blocking access to the staging site with an authentication method, such as the Basic Auth mentioned in another answer.

You may wish to research a Development -> Staging -> Production workflow; it sounds like what you are trying to achieve.


@Pope3001725

You must make sure that this site is unavailable to search engines. The best way to do this is to put the entire dev site behind a login mechanism. Basic Authentication is the simplest and most effective way to do this: it makes the site impossible for search engines to crawl, yet easy for developers and users to access, since they only need to enter a password once per visit.
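
As a sketch, Basic Auth under Apache might look like the following (the paths and username are hypothetical placeholders; the directives go in the dev site's vhost or .htaccess):

# Create the credentials file once, on the command line:
#   htpasswd -c /etc/apache2/.htpasswd devuser

AuthType Basic
AuthName "Staging site"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user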
