
Is making an AJAX site crawlable AND degrading gracefully with JS turned off possible?

@Cugini213

Posted in: #Ajax #GoogleIndex #Javascript #Seo

According to this spec (Google's AJAX crawling scheme), making an AJAX site crawlable by Googlebot means that you have to use hashbang (#!) links, which in turn means the site won't degrade gracefully when JS is turned off. That would make crawlability and graceful degradation mutually exclusive. Is that in fact so, and is there anything that can be done about it?
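For context, under that scheme the crawler translates the hashbang into a query parameter before requesting the page, along these lines (example.com is a placeholder):

What the browser sees:  www.example.com/#!/products/42
What Googlebot fetches: www.example.com/?_escaped_fragment_=/products/42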


2 Comments


@Sue5673885

Yes. You can implement a Node.js-based SEO server that runs the JavaScript on the server side and presents the page to crawlers just as it is rendered in a browser.
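A minimal sketch of such a server, assuming Node with Express and Puppeteer for the headless rendering (the answer names neither library; any headless browser setup works the same way):

// Prerendering endpoint: load the page in a headless browser, let its
// JavaScript run, and hand the resulting HTML back to the caller.
const express = require('express');
const puppeteer = require('puppeteer');

const app = express();

app.get('/render', async (req, res) => {
  const target = req.query.url; // page to prerender, passed by the handler
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // networkidle0 waits until AJAX traffic has settled
    await page.goto(target, { waitUntil: 'networkidle0' });
    res.send(await page.content()); // fully rendered markup
  } finally {
    await browser.close();
  }
});

app.listen(3000);

Your handler.php (from the rule below) would then map the _frag parameter back to the original #! URL and fetch it through an endpoint like this.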

You also need a rewrite rule that routes "_escaped_fragment_" requests to the snapshot handler:

RewriteEngine On
# Catch crawler requests of the form ?_escaped_fragment_=...
RewriteCond %{QUERY_STRING} ^_escaped_fragment_=(.*)$
RewriteRule ^$ /handler.php?_frag=%1 [L]


@Nimeshi995

You should check out:

Stack Overflow: Is making an Ajax site crawl-able and degrading gracefully with js possible?

Snippet:


When possible, I like to only use AJAX to load new pages when history.pushState is available. When history.pushState is not available, I fall back to non-AJAX. While this may be a sub-par experience for those without history.pushState, it makes sure the URL is always pointing to the right place and that the site will be accessible to both Google and users with JavaScript disabled.
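A minimal sketch of that approach in plain JavaScript (loadPage is a hypothetical helper that fetches the new content via AJAX and swaps it into the DOM):

// Enhance links only when the History API is available; otherwise the
// browser (or a crawler, or a user with JS off) follows the real href.
document.addEventListener('click', function (event) {
  var link = event.target.closest('a');
  if (!link || !window.history || !window.history.pushState) {
    return; // normal full-page navigation
  }
  event.preventDefault();
  history.pushState(null, '', link.href); // keep the address bar honest
  loadPage(link.href);                    // hypothetical AJAX loader
});

// Handle back/forward so the content matches the URL
window.addEventListener('popstate', function () {
  loadPage(location.href);
});

Because every URL stays a real, server-renderable address, the same page works for Googlebot, for pushState-capable browsers, and for visitors with JavaScript disabled.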
