CDN virtual subdomain causes duplicated content

@Heady270

Posted in: #Cdn #Htaccess #RobotsTxt #Seo #Subdomain

I have created a subdomain and a CNAME record which points to the domain root. The subdomain static.example.com is actually a copy of the entire website example.com; it is supposed to act as a CDN and serve static content in order to improve speed. However, all of my content can be accessed via the subdomain as well, so Google has indexed it all and now I am dealing with duplicate content.
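In zone-file terms, the setup described amounts to an alias record like the sketch below (a minimal BIND-style example; the names are illustrative):

    ; static.example.com serves the same content as the root domain
    static.example.com.    IN    CNAME    example.com.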

How can I deny crawlers access to the subdomain, bearing in mind that the subdomain has no separate subfolder of its own, so I can't create a separate robots.txt file?


1 Comment


@Heady270

One of the simplest things that you can do is set a rel="canonical" link tag on each of your pages. On your main domain, each page's canonical points to itself; when the same page is served through the CDN, the tag still points back to the live site. This discourages search engines from indexing the HTML pages served through the CDN and consolidates ranking signals onto the main-domain URLs.
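As a minimal sketch, assuming a page at example.com/about/ (the URL is illustrative), the same tag would appear in the page's head section whether the page is served from example.com or static.example.com:

    <head>
      <!-- Always reference the main-domain URL, even when served via static.example.com -->
      <link rel="canonical" href="https://example.com/about/">
    </head>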

CDNs act like caching proxy servers. When a request hits them, they check to see whether they already have the resource. If they don't, they have to fetch it from your website before they cache and serve it. When they make that fetch, they should send some indication to your web server that the request comes from the CDN: a specific user agent string, perhaps, or some other header. Check with your CDN, or log a full request by fetching a brand-new URL through the CDN and inspecting the headers that arrive at your origin. Once you know this, you can configure your server to serve a different robots.txt file only to the CDN, as in the sketch below.
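In your particular setup the subdomain is a CNAME to the same server, so the Host header alone is enough to tell CDN-subdomain requests apart from normal ones. A minimal .htaccess sketch, assuming mod_rewrite is enabled and that robots-static.txt is a file you create next to your regular robots.txt (the filename is illustrative):

    # Serve a blocking robots.txt for requests that arrive on the static subdomain
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^static\.example\.com$ [NC]
    RewriteRule ^robots\.txt$ robots-static.txt [L]

robots-static.txt would then deny everything:

    # robots-static.txt: block all crawlers on static.example.com
    User-agent: *
    Disallow: /

Requests for example.com/robots.txt are untouched, so your main site keeps its normal crawl rules while the duplicate copy on the subdomain is blocked.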
