Mobile app version of vmapp.org
Should we drop AJAX crawling scheme?

@Cugini213

Posted in: #Ajax #Googlebot #Seo #WebCrawlers

So now Google has deprecated the AJAX crawling scheme. They say not to bother implementing it in new websites, because it's no longer needed: Googlebot now has no problem rendering dynamic content. Should we trust this statement immediately, or is it better to adhere to the deprecated standard for a while?


2 Comments


@Twilah146

Several other search engines (Bing, Yandex, etc.) still use the _escaped_fragment_ system, and they're not going to stop using it overnight just because Google has. So if you care about your site being indexable by search engines other than Google, you may still want to support this scheme.
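For context, the scheme works by having crawlers rewrite hash-bang URLs into a query-string form the server can actually see. A simplified sketch of that mapping (the full specification also percent-escapes special characters in the fragment, which this sketch omits):

```javascript
// Sketch of the URL mapping the AJAX crawling scheme defines: a crawler that
// sees http://example.com/page#!state requests
// http://example.com/page?_escaped_fragment_=state instead, so the server can
// return a static HTML snapshot of that application state.
function escapedFragmentUrl(url) {
  const i = url.indexOf("#!");
  if (i === -1) return url; // no hash-bang fragment: nothing to map
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);
  const joiner = base.includes("?") ? "&" : "?";
  return `${base}${joiner}_escaped_fragment_=${fragment}`;
}
```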

Certainly, if you have already set up support for _escaped_fragment_ on your site, there's no reason to disable it. If you're developing a new site, you'll need to weigh the cost of adding this feature against the benefits (keeping in mind that Google currently has a near-monopoly on Internet search, and that in any case, other search engines will likely soon scramble to follow Google's example and implement better crawling of dynamic Ajax-loaded content).
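If you do keep supporting it, the server-side check itself is small: when the `_escaped_fragment_` query parameter is present, serve a pre-rendered snapshot instead of the normal JavaScript app shell. A minimal sketch, where `renderSnapshot` and `serveAppShell` are hypothetical stand-ins for your own rendering code:

```javascript
// Server-side sketch of _escaped_fragment_ support. renderSnapshot and
// serveAppShell are hypothetical placeholders for your own handlers.
function chooseResponse(requestUrl, { renderSnapshot, serveAppShell }) {
  const url = new URL(requestUrl, "http://localhost"); // base only for parsing
  const fragment = url.searchParams.get("_escaped_fragment_");
  if (fragment !== null) {
    // A crawler is asking for a static snapshot of this application state.
    return renderSnapshot(fragment);
  }
  // A normal browser gets the regular JavaScript application.
  return serveAppShell();
}
```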



Finally, note that in most cases, the simplest and most foolproof solution is to implement your site so that it doesn't need such tricks in the first place. At least 99% of the time, you don't really need any Ajax, or even client-side scripting at all. By avoiding unnecessary reliance on Ajax, and by designing your site so that at least basic browsing features work even with JavaScript disabled, you'll ensure the widest possible compatibility across browsers and search engines.

The trick to doing this efficiently is to first set up the basic functionality of your site using basic HTML and CSS and plain old links, with no JS at all. Once you've done that, you can add JS and Ajax on top of that for smoother loading and extra features, while still retaining a graceful fallback interface for users and search engines who don't support the extra features. If you start out relying on Ajax for everything, however, retrofitting a non-Ajax fallback interface later can be very difficult and awkward.
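As a sketch of that layering (the `data-ajax` attribute and `#content` container here are hypothetical markup, not anything from the answer above): the links work as plain navigation with no JS at all, and the script upgrades them to in-page loads when it runs.

```javascript
// Progressive-enhancement sketch: links work as ordinary navigation without
// JS; this script, when it runs, upgrades them to in-page AJAX loads.
// doc and fetchFn are passed in so the logic stays testable outside a browser.
function enhanceLinks(doc, fetchFn) {
  for (const link of doc.querySelectorAll("a[data-ajax]")) {
    link.addEventListener("click", async (event) => {
      event.preventDefault(); // the href itself remains the no-JS fallback
      doc.querySelector("#content").innerHTML = await fetchFn(link.href);
    });
  }
}
```

If the script never loads, nothing breaks: every enhanced link is still a working `<a href>`.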


@Kevin317

Google already crawls and processes JavaScript, so there is no need to implement the AJAX crawling scheme in new sites.
