Are (non-escaped) URL fragments bad for SEO?
I have a product search page with a search filter that offers a fair number of options. Each time the user changes the filter, the search results are updated in place by JavaScript, and the URL fragment (the part after #) is updated to reflect the current selection, so that the state can be bookmarked or shared with others. Pagination is computed server-side, but JavaScript recomputes it entirely whenever the filters are used. Roughly, the filter handler works as in the sketch below.
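Here is a simplified sketch of the approach (function names like renderResults are illustrative, not our actual code): re-render the results client-side, then mirror the selected filters into the URL fragment.

```javascript
// Illustrative sketch only: re-render results in place, then encode the
// current filter selection into the URL fragment so the state is
// bookmarkable and shareable.
function onFilterChange(filters) {
  renderResults(filters); // hypothetical in-place re-render of the result list

  var parts = [];
  for (var key in filters) {
    if (filters.hasOwnProperty(key)) {
      parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(filters[key]));
    }
  }

  // e.g. /search#colour=red&size=m — a plain fragment, not a "#!" hashbang
  window.location.hash = parts.join('&');
}
```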
Am I right in thinking (hoping) that search engines will ignore the fragments in links, and so will not notice that the pages they crawl contain different products from those that JavaScript would filter in for "real" browsers? In other words, will they avoid concluding that we are trying to trick them by serving substantially different results to JS and non-JS browsers?
My intention is that search engines will see the unfiltered, paginated search results pages, but will not be able to trawl through the effectively infinite space of filter combinations (which is what would happen if we used escaped fragments). Is using plain (non-escaped) fragments the correct way to achieve this? My concern is that, now that escaped fragments are the standard way to expose dynamic content to search engines, using pure fragments might be seen as a hack to hide things from them.
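For reference, this is the distinction I mean between plain fragments and the escaped-fragment ("hashbang") scheme; example.com and the filter parameters are placeholders:

```javascript
// Plain fragment (what we use): crawlers generally drop everything after "#".
//   https://example.com/search#colour=red&size=m
//
// Hashbang fragment under the AJAX-crawling (escaped fragment) scheme:
//   https://example.com/search#!colour=red&size=m
// which a crawler maps to a server-rendered request like:
//   https://example.com/search?_escaped_fragment_=colour=red%26size=m
```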