Is making an AJAX site crawlable AND degrading gracefully with JS turned off possible?

According to this spec, making an AJAX site crawlable by Googlebot means that you have to use hashbang (#!) links, which means it won't degrade gracefully when JS is turned off. This seems to imply that crawlability and graceful degradation are mutually exclusive. Is that in fact the case? Is there anything that can be done about it?
Yes. You can run a Node.js based SEO server that executes the page's JavaScript on the server side and serves the resulting HTML to crawlers, just as it would be rendered in a browser.

You also need a redirection rule for the "_escaped_fragment_" query parameter, e.g. in Apache:
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.*)$
RewriteRule ^$ /handler.php?_frag=%1 [L]
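A minimal sketch of what such a handler might look like in Node.js, assuming a hypothetical renderPage() helper that executes the page's JavaScript (e.g. in a headless browser) and returns the finished HTML snapshot; the helper, module path, and port are illustrative, not part of the answer above:

const http = require('http');
const url = require('url');

// Hypothetical helper: runs the page's JS server-side and resolves
// with the fully rendered HTML snapshot for the given state.
const renderPage = require('./render-page');

http.createServer(async (req, res) => {
  const { query } = url.parse(req.url, true);
  const fragment = query._escaped_fragment_;

  if (fragment !== undefined) {
    // Googlebot requests ?_escaped_fragment_=/foo in place of #!/foo,
    // so serve the pre-rendered snapshot of that application state.
    const html = await renderPage('#!' + fragment);
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(html);
  } else {
    // Normal visitors get the regular client-side app shell.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<!-- serve the normal client-side app here -->');
  }
}).listen(8080);

The key point is the mapping: the crawler translates a #!/foo URL into a ?_escaped_fragment_=/foo request, and the server translates it back and returns static HTML.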
You should check out:
Stack Overflow: Is making an Ajax site crawl-able and degrading gracefully with js possible?
Snippet:
When possible, I like to only use AJAX to load new pages when history.pushState is available. When history.pushState is not available, I fall back to non-AJAX. While this may be a sub-par experience for those without history.pushState, it makes sure the URL is always pointing to the right place and that the site will be accessible to both Google and users with JavaScript disabled.
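A minimal sketch of that progressive-enhancement approach, assuming ordinary links that already work as full page loads without JavaScript; the data-ajax selector and the loadPage() helper are illustrative placeholders:

// Only intercept link clicks when the History API is available;
// otherwise the plain <a href="/page"> links trigger normal
// full-page loads, which also works for crawlers and no-JS users.
if (window.history && typeof history.pushState === 'function') {
  document.addEventListener('click', function (event) {
    var link = event.target.closest('a[data-ajax]');
    if (!link) return;

    event.preventDefault();
    history.pushState(null, '', link.href);
    loadPage(link.href); // hypothetical: fetch and swap in the new content
  });

  // Keep back/forward buttons working for AJAX-loaded pages.
  window.addEventListener('popstate', function () {
    loadPage(location.href);
  });
}

Because every link keeps a real href, the same URLs work with JavaScript on or off, which is what lets this approach avoid hashbangs entirely.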