
Can the x-robots noindex instruction be good enough to declare duplicate content?

@Annie201

Posted in: #DuplicateContent #Noindex #RelCanonical

I still have a feeling that at least one search engine in the entire world does not support rel-canonical the way Google does.

On top of that, rel-canonical offers less flexibility, especially when dealing with sets of similar content where each set has many unequal pages, and especially when I don't want to use a view-all option (because my site is mostly image based).

My question, then: can I drop the rel-canonical option entirely and just apply noindex (via the robots meta tag or the X-Robots-Tag header) to all the pages in the duplicate sets, or would Google take the noindex to mean I'm trying to hide something rather than making a proper effort to declare the original content?
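For clarity, this is what I mean by the noindex option; a minimal sketch, assuming an Apache server with mod_headers enabled (the header form is the only way to noindex non-HTML resources such as images):

    <!-- robots meta tag, in the head of each duplicate HTML page: -->
    <meta name="robots" content="noindex">

    # X-Robots-Tag HTTP header via .htaccess, which also covers images:
    <FilesMatch "\.(jpg|jpeg|png|gif)$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>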


1 Comment


 

@Megan663

Noindex is a fine option for dealing with duplicate content. I have used the following (a snippet of each is sketched after the list):


robots.txt -- Prevents Googlebot from crawling the content in the first place, which saves bandwidth. Google may still index some blocked URLs that have external links, but only with the inbound anchor text, since it never sees the page. Doesn't pass PageRank from external links.
meta robots noindex tag -- Allows Googlebot to crawl the page but instructs Google not to index it. Doesn't pass PageRank from external links very effectively.
link rel=canonical tag -- Actually a link element, not a meta tag; supported by all the major search engines. Consolidates PageRank onto the canonical URL (e.g. page 1 of a paginated set).
Just let it all be crawled and indexed -- Google does not penalize duplicate content within your own site (as opposed to content copied from elsewhere); it simply picks one page to index, though it may pick a URL you don't prefer. Problems only arise when there is so much duplication that the volume of duplicate URLs keeps Googlebot from crawling your site effectively.
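For concreteness, here is roughly what each of the first three options looks like; the path and URL below are placeholders, not anything from your site:

    # robots.txt -- block crawling of a (hypothetical) duplicate section:
    User-agent: *
    Disallow: /duplicate-sets/

    <!-- meta robots noindex, in the head of each duplicate page: -->
    <meta name="robots" content="noindex">

    <!-- rel=canonical, pointing every variant at the preferred URL: -->
    <link rel="canonical" href="https://www.example.com/preferred-page/">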


