Is it OK for SEO and user experience to 301 redirect self-generated param URLs?

@Caterina187

Posted in: #DuplicateContent #Parameters #Seo #WebCrawlers

I am using a CMS that offers a wide range of options, and these options lead to many additional special pages that mostly duplicate the content of the original article. These generated pages are accessible via URL parameters, and the CMS automatically links to them from other pages.

My problem is that I want to get rid of these special pages because...


in my eyes they offer hardly any value to my users
they create duplicate content
search engines waste their time crawling all these generated pages


Unfortunately, I am not able to stop the CMS from generating those special pages, and even if I could, many of these unwanted URL parameters are already known to the search engines.

So I plan to use .htaccess 301 redirects to remove each URL parameter that leads to such a special page. For example:

article.php?userid=1&highlight=red&background=blue

would be 301'ed to

article.php?userid=1&highlight=red

and that, in turn, would be 301'ed to the final URL of

article.php?userid=1
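
Concretely, I imagine something like this in .htaccess, collapsing all the parameter-stripping into a single 301 hop (a rough, untested sketch: it assumes Apache with mod_rewrite enabled, that the .htaccess sits in the document root, and that highlight and background are the only parameters to strip):

RewriteEngine On

# Act only when the query string contains one of the unwanted parameters.
RewriteCond %{QUERY_STRING} (?:^|&)(?:highlight|background)=
# Capture the userid value; %1 in the rule refers to this last condition.
RewriteCond %{QUERY_STRING} (?:^|&)userid=([0-9]+)
# 301 straight to the canonical URL. The target carries neither
# "highlight" nor "background", so the rule cannot match again and loop.
RewriteRule ^article\.php$ /article.php?userid=%1 [R=301,L]

A quick curl -I on one of the parameterized URLs should then show a single 301 pointing at the userid-only URL.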

From an SEO perspective, my own website will then contain dozens of different parameterized links that all lead, via 301, to the same webpage. My hope is that search engines would not treat them as duplicates and would stop crawling them after a while (I have also told them to ignore these parameters in Search Console and the like, but with no effect).

For the UX, the only drawback I see is that those redirected links will not have their promised effect, but I think I can neglect that, as those links are very rarely used.

My question is: what harm may it do, concerning UX and SEO, when I 301 redirect (possibly through multiple hops) pages that my own website links to?

Maybe someone even has a better way to handle this situation?


@Heady270

There is a better option, at least as far as Google is concerned. In Search Console there are settings for URL parameters. You can tell Google that certain parameters don't change the content on the page. Then Googlebot will stop crawling them and instead crawl "one representative URL."

Here is Google's help page where they explain how to use this feature: support.google.com/webmasters/answer/6080550?hl=en


Your 301 redirects are not ideal, especially if you can't stop your site from linking to the redirecting URLs. Googlebot will continue to crawl the URLs even after you 301 redirect them. The redirects would prevent Googlebot from finding duplicate content, but Googlebot is pretty good about detecting and handling duplicate content anyway. It usually just picks one of the pages to index and ignores the others. Some duplicated pages like this on your site won't harm your rankings. See What is duplicate content and how can I avoid being penalized for it on my site?
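
If you can edit your page templates, a rel="canonical" link element pointing at the parameter-free URL is another strong hint to Google about which version to index. A minimal sketch (the host name is a placeholder; use your real canonical URL):

<!-- in the <head> of article.php, output on every parameter variant -->
<link rel="canonical" href="https://www.example.com/article.php?userid=1">

Google treats this as a hint rather than a directive, but together with the URL parameter settings it usually makes clear which URL should be indexed.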
