While some search engines support crawling of Ajax content, traditional URLs and static page content are still more reliable for search-engine visibility.
When a significant portion of a page needs to be reloaded, it is usually desirable that the content remain crawlable. In such cases a full-page redirect is a viable, if less elegant, fallback.
I would suggest initially coding the page so that it redirects whenever a large or significant portion of the page has to be reloaded (step 1). That is, use an anchor with a traditional URL such as <a href="/page/1"> rather than <a href="#page=1">.
Afterwards, write JavaScript (jQuery here) to intercept the redirect and use Ajax to reload only the portion of the page that would change (step 2):
<a id="page1" href="/page/1">Page 1</a>
<script>
$("#page1").click(function () {
/*
* Perform ajax here to replace a portion of the page.
*/
return false; // prevent redirection
});
</script>
The first step yields pages with static content and traditional URLs that work even without JavaScript; the second step layers the dynamic, Ajax-driven behavior on top.
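The pattern above can be generalized to every pagination link with a single delegated handler. A minimal sketch, assuming a #content container to swap, links marked with a hypothetical page-link class, and an assumed ?fragment=1 query flag that the server honors by returning only the partial markup:

```javascript
// Hypothetical helper: map a traditional URL like "/page/2" to the
// endpoint that returns only the changing fragment. The "fragment=1"
// convention is an assumption; the server must be set up to honor it.
function ajaxUrlFor(href) {
  return href + (href.indexOf("?") === -1 ? "?fragment=1" : "&fragment=1");
}

// Wire up the interception only in a browser with jQuery loaded, so the
// links still work as plain redirects when JavaScript is unavailable.
if (typeof window !== "undefined" && typeof $ !== "undefined") {
  // Delegated handler: also covers pagination links injected later via Ajax.
  $(document).on("click", "a.page-link", function (e) {
    e.preventDefault();                        // stop the full-page redirect
    $("#content").load(ajaxUrlFor(this.href)); // replace only the content area
  });
}
```

If changing the server is not an option, jQuery's .load() also accepts a selector suffix, e.g. $("#content").load("/page/2 #content"), which fetches the full page and injects only the matching fragment.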