SEO best practice when using listings with multiple pages

@Annie201

Posted in: #Pagination #Seo #Title

I have a website with multiple product categories. Those categories can be sorted (by price, rating, etc.), showing a maximum of X number of products per page.

From an SEO point of view, what is the most efficient way to manage page titles, avoid duplicate title tags, and get my product listings indexed in the best possible way?

<title></title> examples:

Microphones
Microphones, page 2
Microphones, page 2 sorted by price


I know this is enough to make the titles unique, but is it relevant, and does it make sense SEO-wise?


@Debbie626

First, you need to implement proper pagination signals: either rel="prev"/rel="next" link tags or a "view all" page with a canonical pointing to a catch-all URL. Here is a guide on how to do that: support.google.com/webmasters/answer/1663744?hl=en
Keep in mind that if you use a "view all" page, its canonical must be static, meaning you can't just use /my-category&limit=234 since it would change frequently. Instead, a static URI such as /my-category&limit=all works much better.
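
As a rough sketch (the URLs here are just illustrative, assuming a Microphones category paginated with a ?page= parameter), the <head> of page 2 could contain something like:

<title>Microphones, page 2</title>
<link rel="prev" href="https://example.com/microphones?page=1">
<link rel="next" href="https://example.com/microphones?page=3">

If you go the "view all" route instead, every paginated page would point its canonical at the one static view-all URL:

<link rel="canonical" href="https://example.com/microphones?limit=all">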

After you set those up, head over to Google Webmaster Tools and visit "Crawl > URL Parameters". This is where you teach Googlebot how to handle parameters: sorts, limits, filters, or any other query string that modifies the page. So you can tell Googlebot "Hey, if you see something like &sort= it means sort the results by X parameter". You can do the same thing for Bingbot, although there it's more of an "ignore this" than a "learn this" situation.
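
For instance, with hypothetical parameters like these, you would add one entry per parameter and describe what it does:

example.com/my-category?sort=price   (sorts the results)
example.com/my-category?page=2       (paginates)
example.com/my-category?limit=48     (changes how many results are shown)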

Finally, if you have extra/utility query strings that aren't meant to be used by public visitors, you can disallow them in robots.txt. This is normally not recommended unless you have something like API endpoints or special query strings that turn a URL into a JSON response. But be careful what you disallow; it could very well come back to bite you later.
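
A minimal sketch of such a rule (the format=json parameter is hypothetical, just to show the pattern):

User-agent: *
Disallow: /*?format=json
Disallow: /*&format=json

Both Googlebot and Bingbot understand the * wildcard here. Also remember that a disallowed URL can still end up in the index if it is linked to; it just won't be crawled.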


