To disallow indexing the category and tag listings in a blog

@Harper822

Posted in: #Links #RobotsTxt #Seo

Mark Wilson says that category and tag listings in a blog should be disallowed in order to prevent duplicate content. I understand this.

However, I want to put internal links on keywords in my blog posts, pointing to the tag and category pages, so that readers can find more related content.

I wonder: do internal links pointing to category/tag pages that are disallowed in robots.txt still count as useful internal linking from an SEO perspective?
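For reference, the robots.txt setup being described would look roughly like the sketch below. The /category/ and /tag/ paths are placeholders; the actual listing URLs depend on how the blog is set up.

    User-agent: *
    # Hypothetical listing paths; adjust to the blog's real URL structure
    Disallow: /category/
    Disallow: /tag/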





1 Comment


 

@Jessie594

Google Webmaster Central says:

Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods.


So you don't need to disallow crawling of category/tag listings in your blog. Duplicate content is penalized only when it is used with manipulative intent:


In the rare cases in which Google perceives that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. As a result, the ranking of the site may suffer, or the site might be removed entirely from the Google index, in which case it will no longer appear in search results.
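If you want to see concretely what such Disallow rules mean for a compliant crawler, here is a small sketch using Python's standard urllib.robotparser module (the domain and paths are placeholders matching the hypothetical rules above):

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules matching the setup described in the question
    rules = """\
    User-agent: *
    Disallow: /category/
    Disallow: /tag/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # A compliant crawler skips the disallowed tag listing but still fetches the post
    print(rp.can_fetch("*", "https://example.com/tag/seo/"))          # False
    print(rp.can_fetch("*", "https://example.com/2020/01/some-post")) # True

Without those Disallow lines, crawlers can follow your internal links into the category/tag pages and discover related posts from there, which is the behavior the guidance quoted above leans toward.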


