Why not AJAX'ify entire websites?

@Hamm4606531

Posted in: #Ajax #BestPractices #Javascript #WebDevelopment

Is there any solid reasoning as to why sites shouldn't be developed with AJAX functionality that loads the major parts of each page (assuming there are elements like the header, navigation, etc. that remain the same)?

Surely it would be less resource-intensive, since the server wouldn't have to re-send the content that appears on every page, benefiting both the host and the end user.

Answer the question taking into consideration:


The site's JavaScript behaviour degrades gracefully in every instance
I'm talking about new sites, where this behaviour can be implemented from the off, so it doesn't technically cost any money: we're not returning to a finished product to implement it.


8 Comments


@Sarah324

Since you stated that it would degrade gracefully for visitors with Javascript disabled, I can only see two real issues (and one possible issue) that might come up.

Bad For Accessibility

Screen readers and other assistive technologies are often thrown off by dynamic DOM changes. They process and read the page in a linear fashion, and content that changes after the page has loaded may not be handled correctly.

There may be techniques to get around this, but I haven't looked into it too thoroughly.
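One technique that may help (my own suggestion, not something from this answer) is an ARIA live region: marking the swapped container tells screen readers to announce content that changes after load. A minimal sketch, assuming the content area has the id "content":

    // Hint to assistive technologies that this region updates dynamically
    // and that its changes should be announced when the user is idle.
    document.getElementById('content').setAttribute('aria-live', 'polite');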

Increased Complexity

Maintaining this kind of site could be tricky. For one example: If you created a new layout and changed the ID of the content area you were replacing with your AJAX links, it could break your navigation scheme in a pretty perplexing way.

This kind of AJAX behavior would also complicate any traffic analysis you may be doing; Google Analytics wouldn't properly register these AJAX loads without a manual call to pageTracker._trackPageview('this_page');.
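For reference, a minimal sketch of that manual call, assuming the classic ga.js tracker (pageTracker) has already been set up on the page; the path argument is illustrative:

    // After a successful AJAX navigation, report a virtual pageview so
    // Google Analytics counts it (classic ga.js API).
    function reportAjaxPageview(path) {
        if (window.pageTracker) {
            pageTracker._trackPageview(path);  // e.g. '/this_page'
        }
    }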

Adding more complexity to how your page operates also raises the bar for new developers; anyone working on the site would probably have to be made aware of how this behavior affects page loads.

Possible: Slower Page Load on Initial Visit

Depending on how you structure things, this page-fetching AJAX code can only kick in after the document has fully loaded. So your visitor would have to download the whole page, then the JavaScript (if it's an external file), then wait for the browser to render everything and fetch the content via AJAX, before seeing the page content.

Each subsequent link clicked would be faster, but the first page a user visited would actually take longer to display than a static version would.

The reason I labeled this as a possible issue is that you could always send the first page statically (since you'll have the static version as a fallback already) and then use AJAX for the subsequent links.
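A minimal sketch of that hybrid approach: serve every page statically, then let JavaScript hijack internal links so later navigations fetch only the content area. The "content" id and the "partial=1" query parameter are assumptions, not part of this answer:

    document.addEventListener('click', function (event) {
        var link = event.target.closest('a[href^="/"]');
        if (!link) return;  // not an internal link: navigate normally
        event.preventDefault();

        // Ask the server for just the content area (hypothetical flag).
        var url = link.href + (link.href.indexOf('?') === -1 ? '?' : '&') + 'partial=1';
        var xhr = new XMLHttpRequest();
        xhr.open('GET', url);
        xhr.onload = function () {
            if (xhr.status === 200) {
                document.getElementById('content').innerHTML = xhr.responseText;
            } else {
                window.location = link.href;  // fall back to a full page load
            }
        };
        xhr.onerror = function () { window.location = link.href; };
        xhr.send();
    });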



For what it's worth, this doesn't sound like a terrible idea to me - especially for bandwidth-sensitive uses like mobile pages. You'd have to carefully weigh the drawbacks to make sure it was worth it in your case, though.


@Smith883

Having AJAX elements on a page is fine when you have a small user base, but when you have more traffic you want a more static approach to lessen the load on your resources.

For example: say you have 200 people trying to access a page each second, and each page view triggers about 7 database queries for your AJAX calls. That is 1,400 database calls a second, which can bog down a website.

A website that needs to be designed for higher traffic should have static outward-facing pages for any content that can be displayed in a static manner. This can be achieved with a server-side script that runs every second to rebuild the static page, which is then served to end users, reducing your load from 1,400 database calls a second to 7.

It is an SOA (service-oriented architecture) approach to website building.
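As a rough illustration, here is a minimal Node.js sketch of that rebuild-on-a-timer idea; queryDatabase and renderPage are hypothetical stand-ins for the real queries and template:

    // Rebuild the static page once a second, so the ~7 queries run once
    // per rebuild instead of once per visitor.
    var fs = require('fs');

    // Hypothetical stand-ins for the real queries and template:
    function queryDatabase(callback) { callback(null, { headline: 'Hello' }); }
    function renderPage(data) { return '<html><body><h1>' + data.headline + '</h1></body></html>'; }

    function rebuild() {
        queryDatabase(function (err, data) {
            if (err) return console.error(err);
            // Write to a temp file, then rename it over the live file, so
            // a request never reads a half-written page.
            fs.writeFile('page.html.tmp', renderPage(data), function (err) {
                if (err) return console.error(err);
                fs.rename('page.html.tmp', 'page.html', function () {});
            });
        });
    }

    setInterval(rebuild, 1000);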


@Pierce454

First things first

The pros


AJAX can allow you to use a common "base" page and just load the content areas, which can cut down on the load time for users, since a large part of the page is already loaded.
Can allow some eye candy such as fading the content area in and out.


The cons


Doesn't play nice if the page is saved or downloaded for offline viewing.
Can interfere with assistive technologies.
Viewers with javascript turned off won't be able to use the content at all unless a non-javascript version is employed as well.
A lot more work (does it really need to be said?).


Now for your question

Assuming your site degrades gracefully for those without JavaScript, how well that turns out depends on how it's done. For example, if you just display a link to a non-JavaScript version out of the blue, it's an inconvenience for those viewers to have to click another link. On the other hand, a noscript "main page" using traditional links works better for most users, but it still falls short for those using assistive devices, for users arriving at a specific "page" from a deep link, and so on.

All in all, in a world of increasingly fast web connections, shaving off a small amount of file size (we'll presume the JavaScript itself, the CSS, and the images can and will be cached, leaving just the "base" page to save bytes on) doesn't really justify the cons: the extra difficulty (though that's not always a con; a challenge is good) and the lack of support for some users.

In the end, I'd say it's up to you. It would probably work out pretty well, and the vast majority of users would likely see the site as intended, but personally I'd say don't bother; it's not worth the trouble for such a marginal improvement in file size.


@Berryessa370

Yes, it should be; however, it should be the other way around.

The common parts of the page should be sent over HTTP. Then a small AJAX (or, even better, WebSockets) control loads the dynamic content asynchronously.

Of course, you first need to detect whether JavaScript is enabled (by cookie) and only use this method if it is.
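One common way to do that detection, sketched here as an assumption rather than anything from this answer: a script sets a cookie, so the server sees it on every later request and can choose which route to serve.

    // Runs on every page load; if this executes at all, JavaScript is
    // enabled, and the server will see the cookie on the next request.
    document.cookie = 'js=1; path=/';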

So you need the normal full HTTP route and then a WebSockets route. That means code duplication unless you use a tool like node.js.

Most people "think" that it's just not worth the extra effort. And sometimes it's not.

As an aside, a lot of people AJAXify pages wrong: they decide you don't need a non-JavaScript version at all, and you end up with weird hashbang URLs and broken forward/back buttons. Doing AJAX properly requires the HTML5 History API (which IE9 doesn't have). It also requires the developer to complete both versions of the website.
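A minimal sketch of History-based navigation; the loadContent helper, the "content" id, and the "partial=1" flag are illustrative:

    // Fetch just the content area for a given path.
    function loadContent(path) {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', path + '?partial=1');
        xhr.onload = function () {
            document.getElementById('content').innerHTML = xhr.responseText;
        };
        xhr.send();
    }

    // Call from an enhanced link's click handler, e.g. navigate('/about'):
    // update the address bar to a real URL, then fetch the content.
    function navigate(path) {
        history.pushState({ path: path }, '', path);
        loadContent(path);
    }

    // Make the back/forward buttons work again.
    window.onpopstate = function (event) {
        if (event.state && event.state.path) {
            loadContent(event.state.path);
        }
    };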


@Berryessa370

Check out gawker.com/ - this site loads almost completely after the fact. They use "hashbangs" (http://mydomain.com/#!some_section) to determine which content page should be loaded, while the main navigation stays static.

Check out mtrpcic.net/2011/02/fragment-uris-theyre-not-as-bad-as-you-think-really/ for a short tutorial on the concept Gawker used.
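A sketch of the hashbang pattern itself; the "/sections/" URL scheme and the "content" id are assumptions:

    // The part of the URL after "#!" names the section to fetch.
    function loadFromHash() {
        var hash = window.location.hash;       // e.g. "#!some_section"
        if (hash.indexOf('#!') !== 0) return;  // no hashbang: do nothing
        var section = hash.slice(2);           // "some_section"

        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/sections/' + encodeURIComponent(section));
        xhr.onload = function () {
            document.getElementById('content').innerHTML = xhr.responseText;
        };
        xhr.send();
    }

    window.addEventListener('hashchange', loadFromHash);
    loadFromHash();  // handle a hashbang present on the initial load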

There are pros and cons: you have to consider search engines (see code.google.com/web/ajaxcrawling/docs/getting-started.html) and people with JavaScript disabled, and do a lot of testing.

With all that said, the biggest argument against them is probably impatience: a user who has already waited for the page to load doesn't want to wait for still more loading. In my view, the best practice is to load the main site, navigation, and primary content in one pass (on request), and save the AJAX for the non-essential incidentals. That works with the idea of progressive enhancement and mixes the best of both approaches.


@Kristi941

In practice it's a lot of work producing a "fully AJAX" website, especially for major websites, which are very complicated. Some websites that attempt this are Google and Facebook, but even they don't do it perfectly.

Common issues are navigation (i.e. the forward and back buttons) and bookmarking, but there are many other bugs that a lot of developers would rather not have to deal with. It also basically means creating two versions of the website, to serve both JavaScript and non-JavaScript users (plus fixes for all the browsers with poor AJAX support).


@Cody1181609

Because it's probably just not necessary.
Loading basic HTML documents is simple and works. Introducing AJAX adds a whole extra layer of processing for browsers, code, and maintenance: the JavaScript, the back-end plumbing, weird hashbang URLs, and so on. Sometimes this can be justified, sometimes not. It might save you some server resources (might), but will that be enough to offset the upkeep? You have to evaluate that per project.

As an example, when Twitter got its latest redesign, they took the approach that it wasn't just a web(page) site but an application, and the entire thing is heavily AJAX-based, even though most of what it does could be handled with regular page requests. One of the biggest problems, which still happens now though much less often, is arriving there and being greeted with a blank page because something in the AJAX failed.


@Pope3001725

If the content can be reached without JavaScript enabled, then your question doesn't make any sense: it isn't "entirely AJAXified" if you can get to the content through other means. Really, what you're asking is, "is it okay to enhance my users' experience through AJAX?" The answer is obviously "yes".

Edit:

Back when Google came out with its crawlable Ajax proposal, it was panned as a really bad idea. Makes for an interesting read.
