Dynamic URLs are Restricted by Robots.txt; does it really matter for SEO?

@Bryan171

Posted in: #RobotsTxt #Seo

Today I was searching Google for information about robots.txt and found this very helpful thread. I checked Google Webmaster Tools and found that 117 URLs are restricted by robots.txt. That number will increase in the near future, because they are all dynamic URLs. You can learn more from this Excel sheet. If I restrict the dynamic URLs associated with my category pages, will it have any negative impact on organic SEO?
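For context, blocking dynamic URLs of this kind is usually done with a wildcard rule in robots.txt. A minimal sketch, assuming the pattern described in the question (the paths are hypothetical):

```txt
# Hypothetical robots.txt sketch: block crawling of any URL that
# carries a query string (dynamic URLs), while clean category
# pages such as /table-lamps remain crawlable.
User-agent: *
Disallow: /*?
```

Google supports the `*` wildcard in Disallow rules; a rule like this is typically what produces the "restricted by robots.txt" counts reported in Webmaster Tools.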


2 Comments


 

@Eichhorn148

I'm coming back to my question after a long time. I have done what you suggested. Specifically, I was talking about the dynamic pages generated by the narrow-by-search (faceted navigation) facility. You can see the following page to learn more about it:
vistastores.com/table-lamps?material_search=1328
I have set the canonical tag on these pages to the base URL, which is as follows:
vistastores.com/table-lamps
But I'm not seeing any improvement in rankings, nor any reduction in crawling of the duplicate URLs.

If you compare the dynamic pages with the base URL, you will see that they are not 100% duplicates: each dynamic page contains a different set of products than the base page.

So how can we say they are duplicates? I have read Google's guidelines about the canonical tag and learned that it works well only for true duplicates.

I'm honestly worried about my current issue, because I can't get many pages that contain unique products indexed, only because of duplicate page titles.


@Kristi941

Are you blocking pages just because they're dynamic? If so, you're making a mistake: search engines index dynamic pages and pages with query strings just fine. But if you're doing it because you're worried about duplicate content (e.g. the same content appears at several URLs with only minor variations in how it's displayed), then use canonical URLs to tell the search engines, Google in particular, which URL is the main one that should appear in their search results, and that all the other URLs are duplicates of it.
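As a sketch of what that looks like, each dynamic URL would include a canonical link element in its head pointing at the preferred URL (the URL below is taken from the earlier comment; the exact markup on the site is an assumption):

```html
<!-- Served on e.g. vistastores.com/table-lamps?material_search=1328 -->
<head>
  <link rel="canonical" href="http://vistastores.com/table-lamps" />
</head>
```

With this in place, the dynamic variants can still be crawled, but Google consolidates their signals onto the canonical URL instead of treating each variant as a separate page.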

Also, blocking pages within your own website means you will lose PageRank: links to pages that cannot be crawled can still receive PageRank, but those pages obviously cannot pass it on, since they cannot be crawled and indexed. It also means you'll be missing out on internal linking. Links from your own pages carry value just like links from external pages (although the value of those links probably varies).


