Prioritizing duplicate/syndicated content rather than original

@Looi9037786

Posted in: #DuplicateContent #Seo #Syndication

The Context

I work for a company that services the PR and communications industry on a variety of levels. Our direct clients use us as a content creation agency to help integrate their brand into lifestyle media. Part of our service is the distribution/amplification of this content, which we do by serving it to journalists and bloggers (copyright-free).

Here's where the problem comes in...

Because this content is then duplicated (or syndicated) across Canadian media, Google treats it as duplicate content. Our more web-savvy clients are now concerned that they will be penalized for using our content rather than helped by the additional content on their sites. I understand that Google does not penalize duplicate content as such, but it still credits the content as originating from my company rather than from the journalists and bloggers we would like to benefit.

Does anyone have any advice or strategies for handling this "reverse" SEO challenge?


1 Comment


@Goswami781

Sounds like an issue that should be handled with robots directives.

If you don't want search engines to rank you for it, why let them crawl it?

Google takes canonicals only as a "strong hint." If your site publishes content, lets search engines crawl and index it, and then expects other sites that republish it unchanged to rank for it instead, I suspect you'll be disappointed: Google may simply ignore the canonical link.
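For reference, the canonical hint is just a link element in the page head. In your "reverse" scenario, your own copy of the article would declare the journalist's copy as the preferred URL (both URLs below are hypothetical examples, not real addresses):

```html
<!-- Placed in the <head> of the agency's copy of the article.
     It tells Google the preferred version lives at the blogger's URL.
     Google treats this as a hint, not a directive, and may ignore it. -->
<link rel="canonical" href="https://journalist-blog.example.com/lifestyle-article" />
```

This is the opposite of the usual arrangement, where the republisher points back at the original, which is part of why it may not be honored.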

However, if you use robots.txt to forbid Google from crawling that part of your site, it won't be an issue as Google won't ever see it on your site.
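A minimal robots.txt sketch of that approach, assuming the distributable content sits under a hypothetical /syndicated/ directory on your site:

```text
# Hypothetical example: keep all crawlers out of the directory holding
# the copyright-free content offered to journalists and bloggers.
User-agent: *
Disallow: /syndicated/
```

One caveat: robots.txt blocks crawling, not indexing, so a blocked URL can still surface in results if other sites link to it. If that matters, a noindex robots meta tag on those pages (which requires leaving them crawlable) is the stricter option.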

While I agree that canonicals may work, they also may not. Preventing crawling via robots directives almost certainly will.

It doesn't sound like your "web savvy" clients are very savvy.
