
Replacing URL Parameters tool usage with canonical / robots.txt to avoid wasted crawl time

@Correia994

Posted in: #CanonicalUrl #Google #GoogleSearchConsole #Seo #WebCrawlers

For a long time I've been getting "Googlebot found an extremely high number of URLs" warnings, which I know don't necessarily indicate a problem, but I am trying to reduce the number of links that are crawlable by Google to help it focus on what's more important.

I'm thinking of replacing the URL Parameters settings with a canonical link. I have filter parameters which I've set as "Narrows" and "Specifies" in the WMT URL Parameters tool.

These internal filter pages have value, but bringing the user to the main page without any filters is almost as good, and I think this is a trade-off I should make, considering I have too many pages relative to my site's reputation, which is why I'm getting those warning messages.

Two options I'm thinking of:


Adding a canonical link tag on URLs that have filter parameters selected, pointing to the page without the narrowing / specifying filters (sketched below).
Adding a robots.txt rule to avoid crawling of those pages. (This won't waste the crawler's time looking for the canonical tag, but it will throw away the link power these pages currently have instead of passing it on to the page without any filter selected.)
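To make the two options concrete, here is a minimal sketch, assuming a hypothetical filtered URL like www.example.com/shoes?color=red whose unfiltered parent is www.example.com/shoes (adjust to your actual URL structure and parameter names):

    <!-- Option 1: in the <head> of the filtered page, point the canonical at the unfiltered parent -->
    <link rel="canonical" href="https://www.example.com/shoes">

    # Option 2: in robots.txt, stop crawling of URLs that carry the filter parameter
    User-agent: *
    Disallow: /*?color=

The robots.txt pattern above assumes the parameter appears at the start of the query string; if filters can be combined in any order you'd need broader patterns (e.g. Disallow: /*color=), which also raises the risk of blocking URLs you didn't intend to.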


Would you recommend any of these?


1 Comment


@Harper822

Specifying a canonical on the filter pages pointing to the parent (non-filter) page is perfectly fine, but you can also serve a meta noindex, nofollow on these filter pages, and instruct Google what to do with these filter URLs (if they contain parameters) from the section of Google WMT you've already mentioned. Disallowing them in robots.txt is another option, but it won't help remove them from Google's index if they are already indexed.
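A minimal sketch of that meta robots tag, placed in the <head> of each filter page (hypothetical markup, not tied to any particular template):

    <!-- Tell crawlers not to index this filtered view or follow its links -->
    <meta name="robots" content="noindex, nofollow">

One caveat: if a URL is disallowed in robots.txt, Googlebot can't fetch it and so never sees a noindex (or canonical) tag on it, which is why a robots.txt block alone won't remove already-indexed URLs.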

There's no harm in implementing all of these options to ensure that Google only crawls, indexes and places weight on the primary pages of your website which you'd want performing and ranking well.
