Search and Tags for robots.txt
After setting up a blog on blogspot.com, I added the site to Google Webmaster Tools.
I noticed an error, "Restricted by robots.txt", and looked a little into the matter. I found that Blogger's default robots.txt disallows the /search directory in order to avoid duplicate entries in Google's index. Since tag pages live under /search/label/SOME_TAG, they are not indexed either.
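For reference, Blogger's default robots.txt looks roughly like this (a sketch from memory, not a verbatim copy; the sitemap URL varies per blog):

```
# Blogger's default robots.txt (approximate)
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://YOUR_BLOG.blogspot.com/sitemap.xml
```

The `Disallow: /search` rule is what blocks the tag pages, because `/search/label/SOME_TAG` matches that path prefix.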
I run several sites; one in particular is an e-commerce site that is important to me. We use tags for each product. Each tag leads to a separate page such as /tags/tag1/ that lists all products linked to that tag.
And this leads me to my question:
Should I also block search and/or tag pages on my sites by using robots.txt?
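If I did block them, I assume the rule would look something like this (a hypothetical sketch, assuming the tag pages all live under the /tags/ path mentioned above):

```
User-agent: *
Disallow: /tags/
```

As I understand it, an alternative would be a `<meta name="robots" content="noindex">` tag on each tag page, which keeps the pages out of the index while still letting crawlers follow the links on them, whereas a robots.txt Disallow blocks crawling entirely.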
I worry that Google might punish my PageRank/rankings for this "low-quality" content. However, I think these pages are quite useful: we provide a short description of each tag and explain how the listed products address the problem the tag describes.
Moreover, users landing on tag pages have very high bounce rates (>90%), far above the site's average bounce rate.
So, what is the best practice?
I find a lot of pages from the various Stack Exchange sites in Google's search results that are tag pages, most-popular-question pages, and the like, so it doesn't look like Google considers them low quality. I would leave them unblocked unless you see a concrete problem, such as a Panda update having a strongly negative effect on your rankings. Otherwise these pages are a source of traffic: even with a 90% bounce rate, the remaining visitors go further into your site, thanks to these pages appearing in the search results.