SEO: Similar, but not duplicate, content across several subdomains

@Correia994

Posted in: #Seo #Subdomain

I have a website that allows users to find/submit tips for doing specific fitness activities.

The site is structured as "[activity].example.com"


example.com
swimming.example.com
lifting.example.com


Each subdomain is meant to be a (partially) closed-off community specific to that activity. For example, the swimming subdomain is where all swimming tips are created and shared.

However, some tips appear on all of the subdomains (for example, a "stretching" tip would most likely appear on all activity pages). The top tips may also be aggregated to the root domain. This has been happening quite a bit, and I'm wondering what the SEO implications are.

One example of a site with a similar setup is Reddit. They 301-redirect "[subreddit].reddit.com" to "reddit.com/r/[subreddit]", presumably for SEO purposes.

Am I being hurt SEO-wise by having these subdomains? That is, does Google see them as separate sites with duplicate content? Should I structure the site more like Reddit, with a subdirectory structure instead?

I know related questions have been answered, but those deal more with exact duplicate content.

Thank you in advance for your help.




2 Comments


@Sarah324

As far as Google is concerned, there is no difference between subdomains and subdirectories, so they are definitely not seen as separate sites. Additionally, the concept of a "website" is largely irrelevant when it comes to duplicate content: the original page, regardless of where it lives, is kept in the search results while all other copies are filtered out (or at least that is the goal).

It's difficult to say with any authority or specificity exactly how much content must be duplicated for a page to be considered duplicate content and filtered out. Google has not offered any specific rules for how they determine this (for obvious reasons), and no one has done a study that offers strong evidence about what percentage of content must be duplicated, or what other factors contribute to that determination.

Obviously, the more of a page's content is duplicated, the greater the risk that it gets filtered out as duplicate content. But since the same content already exists elsewhere on your website, that's not a big deal: it will still be found on whichever page Google decides is the original and lists in its search results. If you want to do your best to prevent this from happening, try to ensure that a certain percentage of each page, say 50%, is unique rather than content that can be duplicated across pages. In other words, avoid building entire pages out of this dynamic, possibly duplicated content.
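To put a number on that rule of thumb, here is a rough sketch (purely illustrative, not anything Google has published) that estimates what fraction of a page's text also appears in a shared tip, using Python's standard difflib module. The page and tip strings are made up for the example:

import difflib

def shared_fraction(page_text, shared_text):
    """Rough estimate of the fraction of page_text that also appears in shared_text."""
    matcher = difflib.SequenceMatcher(None, page_text, shared_text)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    return matched / max(len(page_text), 1)

# Hypothetical example: a swimming page that reuses a generic stretching tip.
shared_tip = "Always stretch your shoulders and hamstrings before training."
page = (
    "Swimming warm-up guide. "
    "Always stretch your shoulders and hamstrings before training. "
    "Then swim four easy laps focusing on your breathing rhythm."
)

if shared_fraction(page, shared_tip) > 0.5:
    print("Over half of this page is shared content; add more unique copy.")
else:
    print("Most of this page is unique to this subdomain.")

The exact threshold is arbitrary; the point is simply to notice before publishing when a page is built mostly out of the shared tips.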



 

@Murray432

For SEO you could gather the preferred version of the content into one place for each topic and use the best strategy available (probably 301 redirects) to show where the preferred version lives. Canonical tags might work in some cases (depending on how similar the content is), but there is no guarantee; you would have to test. With Google's most recent algorithm update this month, seemingly aimed at hurting content farms, it wouldn't hurt to have only one preferred version, so you don't look like a content farm.
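If you do end up consolidating the way Reddit does, here is a minimal sketch of what that could look like. It assumes a Flask app (purely illustrative; your actual stack and URL patterns may differ) that 301-redirects "[activity].example.com/..." to a subdirectory on the root domain, plus a helper showing the kind of rel=canonical tag a page could emit if a duplicated tip has to stay in place:

from flask import Flask, redirect

app = Flask(__name__)
# SERVER_NAME is required for Flask's subdomain matching;
# "example.com" stands in for the real domain.
app.config["SERVER_NAME"] = "example.com"

@app.route("/", defaults={"path": ""}, subdomain="<activity>")
@app.route("/<path:path>", subdomain="<activity>")
def consolidate(activity, path):
    # e.g. swimming.example.com/stretching -> example.com/a/swimming/stretching
    return redirect(f"https://example.com/a/{activity}/{path}", code=301)

def canonical_link(slug):
    # If a tip must stay duplicated across sections, each copy can point a
    # rel=canonical tag at one preferred URL (the /tips/ pattern here is made up).
    return f'<link rel="canonical" href="https://example.com/tips/{slug}">'

Either way the idea is the same: pick one URL per piece of content and signal it consistently, whether with a redirect or a canonical tag.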

Make sure to read this; it has good tips for your situation. I know you're saying the content is similar rather than duplicate, but it's still a good resource if you haven't read it: googlewebmastercentral.blogspot.com/2009/10/reunifying-duplicate-content-on-your.html
Edited: updated the answer about canonical tags based on comments from John Conde.


