Do the internal searches need to be by GET for SEO purposes?

@Barnes591

Posted in: #Google #QueryString #Search #Seo #Url

I have a site with products that can be searched through a fairly convoluted search form with quite a few input fields. I currently send the data by POST to the same page, so the product listing lives at /products/ and the search results stay on that same /products/ URL. For SEO purposes, indexing, and Google Analytics review, should I use GET instead, even though it would give me a rather lengthy URL:

/products/?category=book&search_title=Andromeda%20Strain&Author=Mich%20Chric%&order_by=price&sort=desc


Also, it is obvious that a URL like the one above could surpass the magic 256 characters, and even more so once Unicode strings are percent-encoded.

My SEO consultant says that I could put a code inside the form and still retain the POST method, but I have been reading a little and I am very confused.
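For reference, here is a minimal sketch of the two form variants; the field names are taken from the example URL above, while the action path and markup details are assumptions:

<!-- GET variant: the browser serializes the fields into the query
     string, producing a shareable URL like the example above -->
<form action="/products/" method="get">
  <select name="category">
    <option value="book">Book</option>
  </select>
  <input type="text" name="search_title">
  <input type="text" name="Author">
  <input type="hidden" name="order_by" value="price">
  <input type="hidden" name="sort" value="desc">
  <button type="submit">Search</button>
</form>

<!-- POST variant: the same fields travel in the request body,
     so the visible URL stays /products/ -->
<form action="/products/" method="post">
  <input type="text" name="search_title">
  <button type="submit">Search</button>
</form>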


2 Comments


@Eichhorn148

Since internal search result URLs contain no unique information, they are irrelevant for SEO purposes, and can even be harmful.

If you build internal search with GET, you produce search result URLs that you later have to exclude from indexing (and from crawling, because crawl budget gets wasted on useless URLs).

But using POST for internal search, you have a single search URL, like example.com/search.html, and don't get any of the SEO-related trouble that comes with gazillions of search result pages.

PS: Actually, it isn't a big deal to exclude search result URLs from indexing and crawling; you need just a few steps (both config files are sketched after this list):


create an additional Search Console property for the /search/ directory (where result URLs look like /search/?q=term)
set the lowest possible crawl rate for it
either block crawling entirely with a robots.txt in the root directory containing a rule such as Disallow: /search/
or, if you want the pages crawled but kept out of the index, place an .htaccess file in the /search/ directory containing Header set X-Robots-Tag "noindex, nofollow" (a crawler only sees this header on URLs that robots.txt still allows it to fetch, so use one approach or the other, not both)
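A minimal sketch of both files, assuming the search result URLs live under /search/ as in the property above:

# robots.txt, placed in the web root: blocks crawling of all
# search result URLs (use this OR the header below, not both)
User-agent: *
Disallow: /search/

# .htaccess, placed in /search/ (Apache with mod_headers enabled):
# lets the pages be fetched but tells crawlers not to index them
Header set X-Robots-Tag "noindex, nofollow"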


But using POST you don't need this at all.

Conclusion: use GET only if you can't use POST.

PPS: In the meantime, having weighed the advantages of POST and GET, I think I would now prefer GET. In many cases I have seen how valuable unique URLs for search result pages are, especially for e-businesses: they can be bookmarked and shared, and they are gone with the POST approach. And all the negative SEO issues of the GET approach can be (not easily, but quite cleanly) reduced to zero.


@Lee4591628

Internal search results shouldn't actually be indexed. Google's Webmaster Guidelines state:


Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.


Matt Cutts weighed in when this was previously discussed on Google’s webmaster help group.

I quite like the point made in this Webmaster World thread:


The issue isn't the "friendliness" of the URL, the issue is the content on those search results pages. Many a site has tanked their traffic by exposing a large number of site search results to Google. Duplicate content galore, including a lot of "no results" pages, can swamp your good content.


So try to stick to easily-crawled category and product pages, and leave internal searches out of the index entirely (one template-level way to do that is sketched below).
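If your search results do end up on crawlable GET URLs, here is a minimal sketch of that template-level fix, assuming the results render on the /products/ page from the question:

<!-- placed in the <head> of the search results template: keeps the
     page out of the index while still letting crawlers follow its
     links to the product pages -->
<meta name="robots" content="noindex, follow">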
