
URLs with parameters & canonical tags: block them via robots.txt or use GSC URL parameters?

@Frith620

Posted in: #CanonicalUrl #GoogleSearchConsole #RobotsTxt #Seo #Url

I have some URLs with parameters. The parameters aren't of the filtering kind and don't create millions of duplicates; they are simply used for tracking purposes, and the content on the pages doesn't change.

These parameterized URLs get linked from the index page and some other authoritative pages of the site. We block them via robots.txt, but we also include canonical tags on these URLs, pointing to the original URLs without parameters.
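For concreteness, the current setup looks something like this (the domain, path, and the parameter name `src` are made up for illustration):

```
# robots.txt (current setup) – blocks every URL with a query string
User-agent: *
Disallow: /*?
```

```html
<!-- served on https://example.com/page?src=homepage -->
<link rel="canonical" href="https://example.com/page">
```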

Now, I understand that pages blocked via robots.txt won't be crawled by bots, so bots won't see the canonical tags, nor will they see the content of the blocked pages, and they certainly won't take into consideration anything on those pages, including valuable inbound links.

Remember, high-authority pages on the site are linking to those parameterized URLs, yet because they are blocked, those URLs don't pass any page weight boost on...

What's the best way for me to handle this situation: avoid wasting Google's crawl budget while also letting my pages get the most weight from the authority pages?

Do I unblock these pages in robots.txt and instead configure URL parameters in Google Search Console, marking them as not affecting page content, so Google won't crawl every instance of URLs with that parameter?

Do I keep canonical tags on these URLs with parameters?




1 Answer


@Candy875

There are several questions to answer here:


What's the best way for me to handle this situation: avoid wasting Google's crawl budget while also letting my pages get the most weight from the authority pages?


As you say, the parameters do not create millions of new URLs, so it is fine to let Google crawl them. Your canonicals do the rest of the work.


Do I unblock these pages in robots.txt and instead configure URL parameters in Google Search Console, marking them as not affecting page content, so Google won't crawl every instance of URLs with that parameter?


Do not handle parameters in Search Console. First, it only works for Google and not for any other search engine. Second, nobody but you knows about the parameter handling or can change it if necessary. Third, parameter handling in Search Console cannot easily be verified by crawling your pages.
Use canonical link elements instead.
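A canonical link element on a tracking-parameter URL looks like this (domain, path, and the `src` parameter are illustrative):

```html
<!-- served on https://example.com/landing?src=nav -->
<link rel="canonical" href="https://example.com/landing">
```

The same element can also sit unchanged on the parameter-free URL itself, as a self-referencing canonical.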


Do I keep canonical tags on these URLs with parameters?


Definitely yes: keep a canonical link pointing to the URL without parameters. If set up properly, Google will understand your canonical linking scheme and respect it.

To sum it up:

Unblock your parameter URLs in robots.txt and use canonical link elements pointing to the non-parameter URLs.
This is exactly the use case canonical link elements were invented for.
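Once robots.txt is changed, you can sanity-check that the parameter URLs are crawlable again with Python's standard-library robots.txt parser (the rules and URLs below are made up for illustration):

```python
# Sketch: verify a revised robots.txt no longer blocks parameter URLs.
# The rules and URLs are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /admin/
"""  # note: no "Disallow: /*?" rule any more

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A tracking-parameter URL should now be crawlable:
print(parser.can_fetch("Googlebot", "https://example.com/page?src=home"))  # True
# Genuinely blocked paths stay blocked:
print(parser.can_fetch("Googlebot", "https://example.com/admin/"))  # False
```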

Maybe one day you'll find a way of tracking internal traffic without using parameters.


