Crawl Budget economy: what to do with URLs with tracking parameters?

@Bethany197

Posted in: #CanonicalUrl #Googlebot #Pagerank #Seo #UrlParameters

So, a site has inbound links to a lot of important pages, and these links carry URL parameters. The parameters are used purely for tracking, similar to the way Amazon links to its own logo: www.amazon.com/ref=nav_logo
Each linked page can have different URL parameters depending on where the link appears (header, footer, sidebar, in content, etc.).

The pages being linked to via these parameterized URLs have canonical tags pointing to the original URLs without parameters, but the catch is that there are probably no direct links anywhere on the site to the parameter-free versions.
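
For illustration, such a canonical tag might look like the following (the domain and path are hypothetical, in the style of the Amazon example above):

    <!-- Served at the parameterized URL https://example.com/some-page/ref=nav_logo -->
    <head>
      <link rel="canonical" href="https://example.com/some-page">
    </head>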

Now I'd like to preserve crawl budget. Here are the options I have:


In Google Search Console, I can set up URL parameters and mark them as not important, block their crawling, have them crawled, or let Googlebot decide.
Block crawling of these URLs with parameters via robots.txt (see the sketch below).
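
As a minimal sketch, assuming a tracking parameter named ref (hypothetical, echoing the Amazon example), the robots.txt approach could look like this:

    User-agent: *
    # Block every URL that carries the tracking parameter in its path
    Disallow: /*ref=
    # Or, if the tracking data is passed as a query string instead
    Disallow: /*?ref=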


So, if I block crawling of them in order to preserve crawl budget, Google won't be able to see the canonical tags, and the only way left to tell Google about the canonical pages is the sitemap.
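
For reference, such a sitemap would list only the parameter-free canonical URLs (the URL below is hypothetical):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Only the canonical, parameter-free version is listed -->
        <loc>https://example.com/some-page</loc>
      </url>
    </urlset>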

But without Googlebot being able to crawl the parameterized URLs as they actually appear on the pages, it won't be able to follow those links and pass any weight, authority, or relevance to the target pages. Is that right?

What is the best scenario that lets the links keep their URL parameters and still pass all their weight/authority, while keeping crawl budget optimal?


2 Comments


 

@BetL925

I recommend your option of using the URL Parameters tool in Search Console. For each of your tracking parameters, add it to the tool and set it to "Doesn't affect page content (ex. tracks usage)".

When you tell Googlebot that these are passive parameters, it will usually crawl just one URL per parameter value. If the parameter can be removed and Googlebot finds a link to the URL without the parameter, it will generally prefer to omit the parameter and value completely. Google's algorithms can then assign all the weight from all the inbound links to that single parameter-free URL.

If you want control over which value of the tracking parameter appears in search results, you could use the rel=canonical link tag in your pages to specify how you would like search engines to index the page. Using rel=canonical won't prevent bots from crawling the many duplicate URLs, but it will consolidate link juice and let you choose which URL gets indexed.



 

@Megan663

But without Googlebot being able to crawl the parameterized URLs as they actually appear on the pages, it won't be able to follow those links and pass any weight, authority, or relevance to the target pages. Is that right?


In theory, yes. But my personal interpretation is that backlinks to pages that are not accessible to search engines still play a minor role. They don't carry all the link authority they would if the pages were accessible, but at the same time I don't think they are completely disregarded.


What is the best scenario that lets the links keep their URL parameters and still pass all their weight/authority, while keeping crawl budget optimal?


This is a classic topic: link authority vs. crawl budget. Trade-offs like this are always challenging (and fun!).

An SEO focusing on link building will tell you to go for link authority.

A technical SEO will tell you to go for crawl budget.

I like to think of myself as an all-round SEO. I therefore recommend a combination (if that's an option; I don't know the scope of this issue). It requires manual work, but if your backlinks to these pages are strong enough, I'd say it's worth it:


Allow crawling of the parameterized URLs that point to pages with backlinks, to preserve link authority.
Disallow the parameterized URLs that point to pages without backlinks, to preserve crawl budget (see the robots.txt sketch after this list).
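
As a minimal sketch, reusing the hypothetical ref parameter and paths from earlier, the combination could look like this in robots.txt (for Googlebot, the more specific Allow rule overrides the broader Disallow):

    User-agent: *
    # Block parameterized URLs by default to preserve crawl budget
    Disallow: /*ref=
    # Re-allow the parameterized URLs whose target pages have strong backlinks
    Allow: /some-page/ref=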


Let me know if you have any questions.


