Ranking drop after using reverse proxy for blog subdirectory and robots.txt for old blog subdomain

@Reiling115

Posted in: #Google #GoogleSearch #ReverseProxy #Wordpress

We have a 3Dcart store and a WordPress blog hosted on a separate server. Originally, we had a CNAME set up so the blog lived at blog.example.com. However, in an attempt to boost link-based and traffic-based authority on the main site, we’ve opted to reverse proxy the blog to www.example.com/blog/.
It’s been about two months since we finished the reverse proxy migration. Everything appears to be working as intended technically, including some robots.txt and sitemap changes; the new URLs are even generating some traffic, according to Google Analytics.
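
For context, the proxy setup is conceptually along these lines (an illustrative Nginx sketch with placeholder hostnames, not our exact configuration):

    # Illustrative only: serve the separately hosted WordPress blog under /blog/
    server {
        server_name www.example.com;

        location /blog/ {
            # placeholder upstream host; the /blog/... path is passed through unchanged,
            # assuming WordPress is configured to live under /blog/
            proxy_pass https://wordpress-host.example.net;
            # pass the public hostname and scheme on to the blog server
            proxy_set_header Host www.example.com;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }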

While Google has been indexing the new URL locations, they’re ranking very poorly, even for non-competitive, long-tail keywords. Meanwhile, the old subdomain URLs are still ranking mostly as well as they used to (even though they aren’t showing meta titles and descriptions due to being blocked by robots.txt).

Our working theory is that Google still has the old subdomain URLs indexed and is treating the new URLs as duplicate content: since it’s been told not to crawl the subdomain, it can’t see the rel canonicals we have in place. To resolve this, we’ve updated the subdomain’s robots.txt so it no longer blocks crawling. In theory, once Google can see the canonical tags on the subdomain pages, any perceived duplicate-content issue should resolve.
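
For illustration, the change to the subdomain’s robots.txt amounts to something like this (a simplified before/after; the real file may contain additional rules):

    # blog.example.com/robots.txt -- before (all crawling blocked)
    User-agent: *
    Disallow: /

    # blog.example.com/robots.txt -- after (crawling allowed again)
    User-agent: *
    Disallow: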

In the meantime, we were wondering whether anyone has any other ideas. We are very concerned about losing valuable traffic, as we’re entering our peak season at the moment.


@BetL925

You are correct that the robots.txt file is the problem. When you blocked your old subdomain with robots.txt:


Google couldn't crawl your blog in its old location and see that it moved.
You lost all the value from the external inbound links to your blog.


Your new solution of removing the robots.txt block and using a link rel canonical tag (sketched after this list) is a much better idea. Now:


Google will crawl the old blog and find out that it moved.
Your PageRank from external links into your old blog location will get correctly attributed to the new blog location.
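
For reference, each post on the old subdomain should carry a canonical tag pointing at its proxied counterpart, along these lines (the slug is a placeholder):

    <link rel="canonical" href="https://www.example.com/blog/sample-post/">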


The only problem is that your blog was unavailable to Google for two months. When content is unavailable for a long stretch, Google often removes its rankings for a while, even after it comes back. I’ve seen that before with badly applied nofollow meta tags and with extended server downtime.

In the best case, your blog will be ranking again within a couple of weeks, once Google has recrawled all of it. I wouldn’t be surprised if it doesn’t rank well for two to six months due to the extended unavailability.
