
Why is honest cloaking for Ajax considered bad practice?

@Margaret670

Posted in: #Ajax #Cloaking #Http #SearchEngines #Seo

Ajax-only websites are laden with problems, and search engine invisibility is merely a small one. They are also unbookmarkable and unlinkable, and they break the back and forward buttons unless you resort to hacks. Fundamentally, they are not how the web is supposed to work. Even so, Ajax is indispensable to the user experience, and Ajax-only applications are cleaner than hybrids in both code and logical structure, more powerful, and more extensible.

I can think of three solutions:


1. Cloaking. While cloaking typically bears the connotation of spamdexing a page with irrelevant keywords served only to a crawler, I mean serving a no-script user agent a stripped-down HTML file with exactly the same content that an Ajax client would load asynchronously (including the same hyperlinks), but without the fancy UI. It is intended for obsolete browsers as an alternative to progressive enhancement, not just for crawlability. Rationale: it doesn't waste no-script users' bandwidth on script tags, or script-enabled users' bandwidth on noscript content that will be loaded anyway in JSON format. Drawback: it raises a false flag of search engine ranking manipulation, to which search engines are apparently hypervigilant. It is also difficult (at best) to recognize all potential user agents and to differentiate correctly between those that execute scripts and those that don't. (A minimal server-side sketch of this follows the list.)

2. Hash-bang fragment identifiers. Rationale: it doesn't appear deceitful to Google. Drawback: it is an absolutely backwards hack that violates the W3C's standard (by using illegal URLs) in favor of a trick invented by a third party. URLs are no longer Uniform Resource Locators; they are pointers to application entry points plus macros that Google and modern browsers happen to follow. This technique does nothing for true no-script browsers, and it breaks any technology that follows the W3C more strictly. (A client-side sketch also follows the list.)

3. Progressive enhancement. Rationale: this is championed as the way to use Ajax while adhering to the original vision of the web. It lets user agents with different capabilities act differently toward the same resource, and a hypervigilant client has no reason to think it is being deceived. Drawback: it wastes bandwidth on script tags at best and on redundant content in noscript tags at worst, defeating part of the purpose of Ajax. (Likewise sketched after the list.)
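A minimal sketch of the cloaking variant described in point 1, assuming a plain Node server: the user-agent pattern and the two render helpers are placeholders, and a real detection strategy would be far more involved, which is exactly the drawback noted above.

```typescript
import * as http from "http";

// Placeholder pattern for known no-script clients; real lists are long and never exhaustive.
const NO_SCRIPT_AGENTS = /googlebot|bingbot|lynx|w3m/i;

// Plain HTML carrying the same text and the same hyperlinks the Ajax client
// would otherwise fetch asynchronously -- no script tags at all.
const staticHtml = (): string =>
  '<html><body><h1>Article</h1><a href="/articles/2">Next article</a></body></html>';

// Thin shell; the content arrives later as JSON.
const ajaxShell = (): string =>
  '<html><body><div id="app"></div><script src="/app.js"></script></body></html>';

http.createServer((req, res) => {
  const ua = req.headers["user-agent"] ?? "";
  const body = NO_SCRIPT_AGENTS.test(ua) ? staticHtml() : ajaxShell();
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(body);
}).listen(8080);
```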

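For point 2, a client-side sketch of hash-bang routing. The crawler-side mapping (a #!path fragment requested as ?_escaped_fragment_=path) was Google's AJAX crawling scheme, which Google has since deprecated; the /api prefix and the response shape here are assumptions.

```typescript
// Route based on the fragment, e.g. https://example.com/#!/articles/2
function route(): void {
  const hash = window.location.hash;               // "#!/articles/2"
  const path = hash.startsWith("#!") ? hash.slice(2) : "/";
  fetch(`/api${path}`)                             // assumed JSON endpoint
    .then((r) => r.json())
    .then((data) => {
      document.getElementById("app")!.innerHTML = data.html;
    });
}

// Back and forward only change the fragment, so re-route on hashchange.
window.addEventListener("hashchange", route);
window.addEventListener("load", route);
```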

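And for point 3, a progressive-enhancement sketch: the markup ships ordinary hyperlinks that work without script, and when script does run it intercepts the clicks and swaps content in place. The data-ajax attribute, the #content element, and the /api prefix are assumptions about the page structure.

```typescript
document.addEventListener("click", (ev) => {
  const link = (ev.target as Element).closest("a[data-ajax]");
  if (!(link instanceof HTMLAnchorElement)) return;

  ev.preventDefault();                              // only happens when script runs
  fetch(`/api${new URL(link.href).pathname}`)       // assumed JSON endpoint
    .then((r) => r.json())
    .then((data) => {
      document.getElementById("content")!.innerHTML = data.html;
      history.pushState(null, "", link.href);       // keep a real, linkable URL
    });
});

// Let back/forward perform a full navigation to the real URL.
window.addEventListener("popstate", () => window.location.reload());
```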
If not for search engine hypervigilance, I would consider cloaking the best solution, although it would be much easier if clients simply sent an HTTP header (or the lack of one) indicating whether they will execute scripts. Is it really such a bad practice? Is there a way to prevent false flags of search engine manipulation? Or am I too paranoid about false flags in the first place?
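For concreteness, the header-based negotiation wished for above would look roughly like this. No such header exists in any HTTP specification; the X-Scripting name below is purely hypothetical.

```typescript
import * as http from "http";

http.createServer((req, res) => {
  // Hypothetical header -- no real client sends this today.
  const canScript = req.headers["x-scripting"] === "enabled";
  res.writeHead(200, { "Content-Type": "text/html", "Vary": "X-Scripting" });
  res.end(
    canScript
      ? '<div id="app"></div><script src="/app.js"></script>'
      : '<h1>Article</h1><a href="/articles/2">Next article</a>'
  );
}).listen(8081);
```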
