
Duplicate content on a site

@Deb1703797

Posted in: #DuplicateContent #Seo

I have a site for a client. He has content he has written on a page and it looks great. He now wants some of this content on another page, under another heading that contains a collection of his writings. In my understanding this would be duplicate content for the block that is copied.

I could block the first page but it is built specifically for that piece of writing. The page with the collection of writings also contains unique writings so I do not want to block that page.

Is there any way to tell Google: "Ignore this div of content but use the rest"?

What is the best route here?

I am thinking of explaining to my client the situation of duplicate content and not adding it unless I can do it without any SEO penalties.


3 Comments


 

@Ann8826881

Is there any way to tell Google: "Ignore this div of content but use the rest"?

As covered in a post here, there are two potential ways to get Google to ignore part of a page that is included in other pages:


JavaScript
Google tends to ignore most JavaScript, which means you could load the content you want hidden/ignored/discounted via JS. The problem is that Google "may" still understand it, read it, or reference it. The only really safe way is to include the content in an external JS file and block that file in robots.txt (alternatively, you could serve the .js file with an X-Robots-Tag: noindex header). Otherwise, if you have the JS inline/embedded in your page, it may still be used (unlikely, but possible).

Frames
You could place the content you want hidden/ignored/discounted in a separate file. You set the robots meta tag for that file to noindex (or block it with robots.txt, or use an X-Robots-Tag: noindex header). You then load that content into your page using a frame. Google will crawl your page, see the frame, look at the framed file, see that it is blocked (noindex/disallowed), and not touch it.

Since that post was written, Google has become better at crawling and understanding JavaScript, so the statement that "Google tends to ignore most JavaScript" may not be quite as accurate now.
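For what it's worth, a minimal sketch of that external-JS route might look like the following; the file name collected-writing.js is just a placeholder I've made up for illustration. The duplicated passage lives only in a script that robots.txt keeps Googlebot from fetching.

    <!-- page with the collection of writings: the duplicated block is injected by an external script -->
    <div id="collected-writing"></div>
    <script src="/js/collected-writing.js"></script>

    /* /js/collected-writing.js - writes the duplicated passage into the placeholder div */
    document.getElementById('collected-writing').innerHTML =
      '<p>...the duplicated passage goes here...</p>';

    # robots.txt - stop Googlebot from fetching the script that carries the text
    User-agent: *
    Disallow: /js/collected-writing.js

If you prefer the header route mentioned above, the robots.txt rule could instead be replaced by serving the .js file with an X-Robots-Tag: noindex response header.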

Of the two choices, I would suggest using an iframe to a separate source file containing the duplicate content, and then blocking that source file from being indexed as covered above.
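A rough sketch of that iframe route, again with a made-up file name (collected-writing.html), might look like this: the main page frames a small HTML file, and that file carries the noindex directive.

    <!-- main page: pull the duplicated passage in via an iframe -->
    <iframe src="/snippets/collected-writing.html" title="Collected writing excerpt"></iframe>

    <!-- /snippets/collected-writing.html: the framed file itself is kept out of the index -->
    <!DOCTYPE html>
    <html>
    <head>
      <meta name="robots" content="noindex">
    </head>
    <body>
      <p>...the duplicated passage goes here...</p>
    </body>
    </html>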

I am thinking of explaining to my client the situation of duplicate content and not adding it unless I can do it without any SEO penalties.

There likely wouldn't be any "penalties". If Google finds duplicate content on the same site, it will make a decision as to which page to index, as covered here in Google Webmaster Tools - Duplicate Content:


Google tries hard to index and show pages with distinct information.
This filtering means, for instance, that if your site has a "regular"
and "printer" version of each article, and neither of these is blocked
with a noindex meta tag, we'll choose one of them to list.
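For reference, the "noindex meta tag" mentioned there is a single line in the head of whichever version you do not want listed, for example the printer version:

    <!-- in the <head> of the printer version of the article -->
    <meta name="robots" content="noindex">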


If you're concerned with getting both pages indexed, then the above might help. If you're not worried about Google choosing which page to index, then there's not as much reason to be concerned.



 

@Berumen354

You can tell Google to ignore that part of the page by using the googleoff/googleon crawl tags.

Here is how - perishablepress.com/tell-google-to-not-index-certain-parts-of-your-page/
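For illustration, the technique that article describes wraps the block you want excluded in googleoff/googleon comment tags, roughly like this (the surrounding paragraphs are placeholders):

    <p>Unique content that should be indexed as usual.</p>
    <!--googleoff: index-->
    <p>The duplicated passage you want the crawler to skip.</p>
    <!--googleon: index-->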



 

@Sarah324

There is no way to tell Google to ignore part of a page. But in this case it's not really a big deal. It's not uncommon to have the same text on multiple pages, as there are perfectly legitimate cases where this is necessary and helpful to users. And in cases like this, where it happens infrequently, it won't be considered low-quality content. I wouldn't be concerned about this at all.


