
Moving directory to domain and maintaining SEO

@XinRu657

Posted in: #Redirects #Seo #Subdirectory

I recently completed a website for a client, which was hosted on my own personal server until we were ready to move it to their own dedicated domain. Inadvertently, the sub-directory was indexed while the site still resided on my domain, so searches for the site now return results pointing to my server.

I set up a 301 Permanent Redirect and put in a crawl request to Google hoping to rectify the search results, but unfortunately the results are still the same.

All the literature I have been reading covers setting up 301s when moving from one domain to another, but little to none covers moving a site from a sub-directory to its own domain.

Is there a specific process to follow when moving a site from a sub-directory to its own domain while maintaining its preexisting SEO?


2 Comments


@Alves908

You have a few options: a 301 redirect, a 404 Not Found, a 410 Gone, or blocking access with robots.txt. Which one is right depends on the situation.

If you have links pointing to the sub-directory, then a 301 redirect is a temporary solution that maintains whatever value those links carry. If you are not concerned about links to the sub-directory, then the following options may be better. I caution you that any links to the sub-directory will have to be dealt with eventually. You have to decide whether any of them have enough value to preserve for a period; if they do, a 301 redirect can help. Keep in mind that 301 redirects only have value as long as they continue to exist. In the end, they will likely have to be broken so that the sub-directory site can be taken down, if that is the plan.

If there are no links to the sub-directory, then there are better options that will resolve your issue faster.

A 301 redirect preserves link value. But if there are no links to be concerned with, a 301 redirect would keep both sets of URLs in the index for a time: the sub-directory URLs and the (proper) domain URLs. That does not seem to be what you want, and it will slow down any bubbling up of the (proper) domain in the SERPs.
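If you do go the 301 route, here is a minimal sketch in Apache .htaccess terms, assuming a hypothetical /clientsite/ sub-directory, a hypothetical new domain clientsite.com, and mod_rewrite being available; adjust the path and domain to your own setup:

# .htaccess at the root of the old host (Apache, mod_rewrite)
RewriteEngine On
# Send every URL under /clientsite/ to the same path on the new domain
RewriteRule ^clientsite/(.*)$ https://clientsite.com/$1 [R=301,L]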

A 404 Not Found error could remove the sub-directory pages from the index, but it will take a while, because a 404 only tells a searcher or spider that the page was not found, not that it is gone for good. On the other hand, a 404 is naturally the easiest thing to produce. For a period, Google will retry each page until it decides the page is actually gone. Google uses a TTL (time to live) style metric based upon freshness, and because these are new pages, the TTL may be rather long. This could therefore be a long process.

A 410 Gone error could also remove the sub-directory pages from the index, but each page still has to be fetched again on whatever schedule Google has for it, governed by the same freshness-based TTL. Because these are new pages, that may take some time as well. Still, it would be a much shorter process than with a 404, since a 410 says explicitly that the page is gone.
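To serve either response without deleting files by hand, a hedged sketch using Apache's mod_alias (the 404 form assumes Apache 2.4), again with the hypothetical /clientsite/ path; keep whichever status you choose:

# Return 404 Not Found for everything under the old sub-directory
# RedirectMatch 404 ^/clientsite/
# Or return 410 Gone, which tells crawlers the pages are gone for good
RedirectMatch gone ^/clientsite/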

Using the robots.txt file to block access to the sub-directory can drop the pages from the index much faster. Once the robots.txt file is read again and Google finds itself blocked, it will generally drop the indexed pages within days. The wait is mostly for the re-read: Google will not check the robots.txt file more than once in a 24-hour period, and depending on the TTL it holds for the site overall, the re-read can happen within a few days, though it can also take weeks. Once the file has been read, however, dropping the sub-directory pages from the index can take just a few days.

If there is no advantage to retaining the sub-directory, it is a much faster process to simply block it in the robots.txt file, dropping those pages from the index and allowing the (proper) domain to perform in the SERPs as you intend. If you are in a hurry, blocking access to the sub-directory within robots.txt will be the fastest option.
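A minimal robots.txt sketch, assuming the same hypothetical /clientsite/ path; the file lives at the root of the old domain:

User-agent: *
Disallow: /clientsite/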

As far as SEO is concerned, there is likely no value you can preserve beyond links and existing search traffic, and the links may be of little value; you have to decide. The primary reason is that this is a new site, and I would assume it has not soaked into the search engines long enough to be of real value. Certainly, your customer will want that value for their own site, and breaking the sub-directory is the fair and proper thing to do for them.

The only realistic value I can see is search traffic. If it is low, it costs you almost nothing to break the sub-directory and let the (proper) domain bubble up in the SERPs. If that is the best option for your scenario, it should be a relatively fast process in search engine terms, 30-60 days, to begin building search traffic, which is quite normal. Remember, depending upon the site, it can take 6-12 months to properly soak into the SERPs anyway; if your sub-directory has only existed for a few months, it has not really soaked in much at all. Sometimes the best thing to do, in business and in SEO, is to cut bait and fish.


@Hamaas447

You need to 301 redirect every single sub-directory URL to its equivalent URL on your client's website. If you do not map them one to one, it will never work properly, and you will lose your PageRank:

yoursite.com/sub/first-page => clientsite.com/new-url-maybe-the-same-for-first-page

yoursite.com/sub/second-page => clientsite.com/new-url-maybe-the-same-for-second-page
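For instance, a minimal Apache .htaccess sketch of that one-to-one mapping, using mod_alias and the hypothetical URLs above; each page gets its own line when the new URL differs from the old path:

# Map each old sub-directory URL to its equivalent on the client's domain
Redirect 301 /sub/first-page https://clientsite.com/new-url-maybe-the-same-for-first-page
Redirect 301 /sub/second-page https://clientsite.com/new-url-maybe-the-same-for-second-page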

After correcting your redirects, you have to wait for some time (weeks, maybe months) before you will see the desired results. 301 redirects do not transfer PageRank immediately; Googlebot decides when to re-crawl your site, and eventually it will follow the redirects and update the URLs shown in the search results.
