
Running live site side by side with its replacement

@Bryan171

Posted in: #SearchEngines #Seo

Our current site has become outdated; it's written in classic ASP. We have recently redesigned the site using Magento. Basically all the URLs are different, but the content is about the same. We did a soft rollout of the new site alongside the current site, putting it on a www1 subdomain. We are directing roughly 8% of our current traffic to the new site.

I found a good article on how to execute the full cutover:
How to Avoid SEO Disaster During a Website Redesign

My concern right now is that the sites have been running side by side for over a month. I had planned to have cut over by now. The new site has a new GA profile, but I haven't submitted it to any search engines, since the plan is to give it the www domain when the cutover happens. However, the www1 subdomain is already being crawled, probably because the bots got redirected from the current site.

My question is whether I'm going to start running into trouble with search bots the longer these two sites run side by side, i.e. duplicate content, competing rankings, etc. Unfortunately, the cutover has been delayed until some other important, unrelated projects are taken care of.


1 Comment


 

@Sherry384

It sounds like you could run into trouble, if not now, then soon enough.

I have never been in favor of running two public copies of any site. I understand the need for a development copy, at least for testing, but always make sure the robots.txt file blocks all spiders from accessing the new site while both sites remain publicly accessible.
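For example, a minimal robots.txt on the www1 subdomain (the subdomain name comes from the question; the directives are standard robots.txt syntax) that asks all crawlers to stay out would be:

User-agent: *
Disallow: /

Keep in mind that robots.txt only stops crawling; URLs that have already been indexed can linger in the index for a while.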

Setting aside 301 redirects to retain link value and other considerations, it is always better to update a site all at once if it is a complete replacement. The content does not have to change much, but if the structure changes, that is enough reason to switch the site over and put the 301 redirects and any other necessities in place quickly. Yes, there are slower, less painful methods; however, I have never been sure the extra work of avoiding the pain is worth it. Google and Bing are very flexible, and if you help them with 301 redirects and sitemaps, any large-scale update can be successful.
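As a rough sketch of what those redirects could look like, assuming the new site ends up on Apache with mod_rewrite (typical for Magento) and using made-up URL paths, .htaccess rules mapping old classic ASP URLs to their new counterparts might be:

RewriteEngine On
# Map a legacy classic ASP page to the equivalent page on the new site (paths are hypothetical)
RewriteRule ^products/widget\.asp$ /widget.html [R=301,L]
# Legacy URLs that relied on a query string need a RewriteCond; the trailing ? drops the old query string
RewriteCond %{QUERY_STRING} ^id=42$
RewriteRule ^product\.asp$ /widget.html? [R=301,L]

The actual old-to-new mapping has to be built from an inventory of the existing site's URLs, ideally covering every page that has inbound links or search traffic.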

To answer your question: if it is going to be a while before the real cutover is done, then it is probably best to block crawling of the new site using the robots.txt file. During this period, you can check your 301 redirects and other considerations before making the site live. If you only have a week or so, then it may not make enough of a difference to worry too much. Still, it is a personal call, and erring on the side of caution could be a win-win. Any dip in performance has likely already happened, as the index now has both sites listed. The dip can grow worse, of course, but it would clear up quickly (30-60 days) once the cutover has been done, provided that you do not wait too long.
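If you want to verify those redirects before going live, a small script can walk a list of old URLs and confirm each one answers with a 301 pointing at the expected new location. This is only a sketch, assuming the Python requests library is available and using placeholder URLs:

# Quick 301 checker: each old URL should return a 301 whose Location header
# matches the expected new URL. The URLs below are placeholders.
import requests

REDIRECTS = {
    "https://www.example.com/products/widget.asp": "https://www.example.com/widget.html",
    "https://www.example.com/about.asp": "https://www.example.com/about-us",
}

for old_url, expected in REDIRECTS.items():
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location == expected:
        print("OK   " + old_url + " -> " + location)
    else:
        print("FAIL " + old_url + ": status " + str(resp.status_code) + ", Location " + repr(location))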

It's a personal call really. I would use robots.txt if it were my site and make the switch all at once when ready. The reason is simple: the recovery would begin, and at least that time will not be wasted.
