Block Google from crawling low-value pages

@Murray155

Posted in: #CrawlRate #Googlebot #Seo #WebCrawlers

I recently analysed the week-on-week crawl requests made by Googlebot to my server. Here is the graph:

[Graph: week-on-week Googlebot crawl requests]

Duplicate requests are those where Googlebot made more than one request for the same page within a week.
I am concerned about the number of duplicate requests the bot is making, because they waste a significant share of the bandwidth it spends crawling my website. On analysing the pages being requested, I found that many requests were for low-value pages (thin content, low SERP rankings). These are typically list pages for certain combinations of filters, where the majority of combinations offer little value.
So I am wondering whether it is wise to block Googlebot from accessing such low-value pages so it can concentrate on the high-value pages in my sitemap.
I have already removed such pages from my sitemap.
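If blocking is the way to go, my understanding is that a couple of Disallow rules in robots.txt would be the usual mechanism. Here is a rough sketch of what I have in mind, assuming (hypothetically) that the filter list pages are generated through a ?filter= query parameter; Googlebot honours the * wildcard in these patterns:

    User-agent: Googlebot
    # Block list pages generated by filter combinations (hypothetical URL pattern)
    Disallow: /*?filter=
    Disallow: /*&filter=

One caveat I am aware of: robots.txt stops crawling, not indexing, so blocked URLs that are heavily linked may still appear in results without a snippet.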


1 Comment


 

@Reiling115

As I understand it, you have many less important pages that shouldn't be crawled (or at least not that often).

I don't know if removing the pages from the sitemap will help. The sitemap is not a whitelist of the pages that will be crawled; when a search engine finds links to pages outside the sitemap, it will crawl them just as well.

Did you consider assigning a lower priority to these pages in your sitemap? This source from the Google Webmasters Blog indicates that the 'priority' attribute helps search engines prioritize pages, and some SEO articles claim that pages with a lower 'priority' get crawled less often.
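As a rough sketch of what that could look like in the sitemap (the URLs here are made up for illustration; 'priority' is a hint between 0.0 and 1.0, with 0.5 as the default, not a guarantee):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- High-value page: raise the crawl hint -->
      <url>
        <loc>https://www.example.com/important-category/</loc>
        <priority>0.9</priority>
      </url>
      <!-- Low-value filter page: lower the crawl hint (hypothetical URL) -->
      <url>
        <loc>https://www.example.com/list?filter=rare-combination</loc>
        <priority>0.1</priority>
      </url>
    </urlset>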


