
How to prevent duplication of content on a page with too many filters?

@Reiling115

Posted in: #DuplicateContent #Filtering #Url

I have a webpage where a user can search for items based on around 6 filters. Currently I have the page implemented with one filter as the base filter (the part of the URL that would get indexed) and the other filters in the form of hash URLs (which won't get indexed). This way there is less duplication.

Something like this

example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value


Now, as you can see, only one filter is within reach of the search engine while the rest are hashed. This way I end up with 6 pages.
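
To make that concrete, the URL-building logic is roughly like the following (a simplified TypeScript sketch; the function name and filter names are placeholders, not my actual code):

// One filter value goes into the crawlable path; every other filter
// only ever ends up in the hash fragment, which search engines ignore.
function buildFilterUrl(baseValue: string, extras: Record<string, string>): string {
  const path = `/${baseValue}-items`;
  const hashPairs = Object.entries(extras).map(([name, value]) => `${name}-${value}`);
  const hash = hashPairs.length > 0 ? `#by-${hashPairs.join('-')}` : '';
  return `https://example.com${path}${hash}`;
}

// buildFilterUrl('filter1value', { filter3: 'filter3value', filter2: 'filter2value' })
// -> 'https://example.com/filter1value-items#by-filter3-filter3value-filter2-filter2value'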

Now the problem is that I expect users to sometimes use two filters while searching. As per my analysis with the Google Keyword Analyzer, a fair number of users are likely to use two filters in conjunction when searching.

So how should I go about it?

Having all the filters as part of the URL would simply explode the number of pages, while sticking to the current approach wouldn't let me target those users.
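
To put rough numbers on it: if each of the 6 filters had, say, 10 possible values, exposing any pair of filters in the path would already mean C(6,2) × 10 × 10 = 1,500 indexable URLs, and exposing all six would mean up to 10^6 = 1,000,000 combinations.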

I was thinking of going with at most 2 base filters and keeping the rest as part of the hash URL. The only thing stopping me is that it would lead to duplicate content as per Google Webmaster Tools' guidelines on URL structure.


2 Comments


@Heady270

If users are searching by combinations of two filters, then it makes sense to have a landing page for them. However, don't just blindly create pages for all combinations of two filters. Rather, use Google Trends or the Google AdWords keyword estimator to figure out which filter combinations will have search volume. Create landing pages only for the combinations that are likely to have search volume.

Make sure all the landing pages you create have unique titles and meta descriptions, and that they read well. This often takes a bit of thought, and the ability to hard-code some of the cases rather than relying on rules alone to generate them.
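
A rough sketch of what I mean, mixing a generic rule with hard-coded overrides (the names and copy below are only illustrative, not a drop-in implementation):

// Generate titles and descriptions by rule, but let hand-written copy
// override the rule for the filter combinations that matter most.
const overrides: Record<string, { title: string; description: string }> = {
  'red|large': {
    title: 'Large Red Widgets | Example Store',
    description: 'Browse our hand-picked range of large red widgets.',
  },
};

function landingPageMeta(filter1: string, filter2: string): { title: string; description: string } {
  return overrides[`${filter1}|${filter2}`] ?? {
    title: `${filter2} ${filter1} items | Example Store`,
    description: `Shop ${filter1} items, filtered by ${filter2}.`,
  };
}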

It sounds like you already have the ability to link to just one filter and then apply the other filters with hash URLs. I like that approach. I have done something similar myself using two scripts that generate the filter pages, keeping one of them blocked in robots.txt to control whether specific filter combinations get indexed.


@Si4351233

Do you want to rank for the filter URLs? You shouldn't, because the filter URLs will mostly serve the same content and Google will only rank the best URL anyway.

As an SEO expert, I suggest disallowing those filter URLs in your robots.txt and letting Google crawl only the main URLs. You can identify the pattern of the filter URLs and disallow them with one or two rules in robots.txt.
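
For example, if the filter URLs followed a recognisable pattern such as a filter query parameter (a hypothetical pattern here; adjust it to your real URLs), the rules could be as simple as:

User-agent: *
# Hypothetical pattern: block any URL carrying a secondary filter parameter
Disallow: /*?*filter2=
Disallow: /*?*filter3=

You can verify patterns like these with Google's robots.txt testing tool before relying on them.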
