
Restrict Google from indexing certain content on my page

@Angela700

Posted in: #Google #Seo #Yandex

I have an ecommerce shop with ~10 000 items.

I wrote a parser to load these items onto my website from another website, but the item descriptions are not unique, so I am afraid that Google may penalize my website.

My question: how can I prevent Google from indexing certain parts of my website? I will rewrite the descriptions later, but that will take too much time.

I found this question: productforums.google.com/forum/?hl=en#!topic/webmasters/Z5JP3UQCVWg

In it I can see three solutions:


1. Create an iframe that loads the description, and use a meta noindex tag on the framed page.
2. Load the content with JS (I read this at the link above). Can you tell me more about this? Is content loaded with JS from an external file really not indexed by Google? (I mean, if I have a div and write something like $(div).html('not indexed content'), will that work?)
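For concreteness, here are minimal sketches of both ideas (the path /desc/123.html and the element id are my own illustrative assumptions; the second fragment assumes jQuery is already loaded on the page):

```html
<!-- Approach 1 (sketch): keep the description on a separate page and
     pull it in with an iframe. The framed page /desc/123.html would
     itself carry <meta name="robots" content="noindex"> in its head. -->
<iframe src="/desc/123.html" style="border:none"></iframe>

<!-- Approach 2 (sketch): start with an empty div and inject the
     description with JavaScript after the page loads. -->
<div id="description"></div>
<script>
  $(function () {
    // jQuery's .load() fetches the URL and inserts the returned HTML
    // into the matched element.
    $('#description').load('/desc/123.html');
  });
</script>
```

Crawlers that do not execute JavaScript never see the injected text, but note that Googlebot does render JavaScript, so approach 2 is not a guarantee by itself.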
3. I also just found this link: Preventing robots from crawling specific part of a page.

It basically says I should have a div with style="display:none" and then remove the display:none with JS. This seems to be the easiest way to do it. Does it really work? And will it work not only in Google but in other search engines? (I am only worried about yandex.ru, actually.)
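A minimal fragment of the display:none idea described above (the id and the plain-DOM reveal are illustrative assumptions, not code from the linked thread):

```html
<!-- The imported description is present in the HTML but hidden. -->
<div id="product-description" style="display:none">
  Imported description text goes here.
</div>
<script>
  // Remove the inline display:none once the DOM is ready, so the
  // description becomes visible to users.
  document.addEventListener('DOMContentLoaded', function () {
    document.getElementById('product-description').style.display = '';
  });
</script>
```

Keep in mind that with this approach the text is still in the served HTML, so any crawler that reads the raw markup can still see it; hiding only changes what is rendered.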


1 Comment


@Jessie594

In response to your key question, whether Googlebot can be restricted from indexing only certain portions of a page: the answer is, not at this time. The only supported methods for controlling indexing currently apply to full pages, such as adding rel="nofollow" to certain links and using your robots.txt file to deny access to certain resources.
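For illustration, the robots.txt control mentioned above looks like this (the /imported-descriptions/ path is hypothetical):

```
# robots.txt -- deny crawling of a whole section of the site
User-agent: *
Disallow: /imported-descriptions/
```

The rel="nofollow" control goes directly on individual anchor tags, e.g. <a rel="nofollow" href="/imported-descriptions/item.html">, and tells crawlers not to follow that particular link.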

As for what @SimonHayter says: it is very common for e-commerce sites to have identical product descriptions, because many sites take the descriptions from the manufacturer, so the same text appears across many sites. Usually these are only one or two paragraphs, so it is not a big deal as long as the rest of your site is unique. Googlebot is intelligent enough to identify e-commerce pages based on content and to be more forgiving of duplicated product descriptions.


