Loading main JavaScript on every page? Or breaking it up across relevant pages?
I have a 700kb decompressed JS file which is loaded on every page. Before, I had 12 JavaScript files on each page, but to reduce HTTP requests I combined them all into one file.
This file is ~130kb gzipped and is served with gzip compression. However, on the client it still has to be unpacked, parsed, and loaded on every page. Is this a performance issue?
I've profiled the JavaScript with the Firebug profiler but did not see any issues. The problem (or illusion of one) is that the file bundles jQuery libraries that are not used on every page.
For example, jQuery DataTables is 200kb compressed and is only used on 2 of my website's pages. Another is jqPlot, which is another 200kb.
I now have 400kb of excess code that isn't executed on 80% of the pages.
Should I leave everything in 1 file?
Should I take out the jQuery libraries and load only the relevant JS on each page?
~700kb of JavaScript is a performance issue even when it is compressed. The rules I would follow to optimize the code are:
Minify your JavaScript. Gzip only compresses the transfer; it does not reduce the amount of code the browser has to parse. Run the code through a good JS minifier first, and ideally minify each of the 12 files separately before combining them for best performance.
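As a minimal sketch (assuming a Node.js build step with the terser npm package, which the answer does not mention, and placeholder file names), minification could look like this:

// build-step sketch: minify the combined source before deploying
const { minify } = require("terser");
const fs = require("fs");

const source = fs.readFileSync("combined.src.js", "utf8");
minify(source, { compress: true, mangle: true }).then(function (result) {
    fs.writeFileSync("combined.min.js", result.code);
});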
Use asynchronous JavaScript loading. Good asynchronous loading gives a very fast load and render time because it does not block the rendering process, so the perceived page load time drops sharply: images and other visible content appear as if no JavaScript were being loaded at all.
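For example (the paths are placeholders), a script can be marked async so it doesn't block rendering, or injected dynamically from a small inline snippet:

<!-- non-blocking: the browser keeps rendering while this downloads -->
<script async src="/js/app.min.js"></script>

<!-- alternatively, inject the script dynamically -->
<script>
    var s = document.createElement("script");
    s.src = "/js/app.min.js";
    document.getElementsByTagName("head")[0].appendChild(s);
</script>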
Use the Google CDN for jQuery. You appear to be loading jQuery from your own website, which is an added disadvantage; load it from the free Google CDN instead. jQuery is used by a huge share of websites, so there is a good chance it is already sitting in the visitor's browser cache.
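A typical snippet (the jQuery version shown is only an example) uses the Google Hosted Libraries URL, usually with a local fallback in case the CDN is unreachable:

<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');</script>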
Set long Expires headers. Give your JS files far-future Expires headers so they are not downloaded again on repeat views, which removes those requests entirely on the second visit. In my experience, slow load times on subsequent pages cause more exits than on the first page view.
Check with PageSpeed. Other resources also affect page load time, so cross-check and optimize them as well; every bit saved elsewhere leaves more headroom for your JS to load.
700kb of JavaScript IS a performance issue, because it all has to be parsed after the page loads. You should therefore take care that only the scripts that are actually needed get loaded. One big JavaScript file may be OK on full-AJAX sites such as Gmail, where navigation is handled internally without leaving the single page. However, even full-AJAX sites load JS dynamically to avoid loading unnecessary code at startup (for both memory and speed reasons).
You've said you wanted to reduce HTTP traffic, so you should also look at HTTP caching. The catch is that changes to the JS may not be visible until the cache expires if you set the expiry time too high.
There is a trick, however: don't refer to myscript.js, but to myscript.js?version={myversion}. After an application update you change {myversion}, which forces the JavaScript to be re-downloaded.
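A minimal sketch of that trick (the file name and version values are placeholders):

<!-- served with a far-future Expires header; cached until the URL changes -->
<script src="/js/myscript.js?version=42"></script>

<!-- after the next deployment, bump the version to force a fresh download -->
<script src="/js/myscript.js?version=43"></script>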
If your framework/CMS/whatever has the appropriate functions, you can include the scripting conditionally as @Michael suggests, but without the additional library.
Taking your DataTables case as an example, WordPress might handle the situation with something like:
<?php // For reference; this isn't functional code. ?>
<?php if ( is_page( 'whatever' ) ) : ?>
    <script src="/path/to/datatables.js"></script>
    <script>
        // [Your datatables setup here]
    </script>
<?php endif; ?>
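If you'd rather keep it out of the template, a hedged sketch using WordPress's own enqueue mechanism in the theme's functions.php (the handle, path, and page slug below are placeholders) could look like this:

<?php
// Sketch: only enqueue DataTables on the pages that actually use it.
function maybe_enqueue_datatables() {
    if ( is_page( 'whatever' ) ) {
        wp_enqueue_script( 'datatables', '/path/to/datatables.js', array( 'jquery' ), null, true );
    }
}
add_action( 'wp_enqueue_scripts', 'maybe_enqueue_datatables' );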
There's nothing wrong with RequireJS; you just need to evaluate whether the additional complexity it adds (plus learning to use it in the first place) offsets what more readily-available tools can do for you. If you only have the two cases you mentioned above, the conditional include might be the better option. If you've got a lot more going on, then RequireJS might be the better approach on the whole.
You can use RequireJS to dynamically load the libraries you need only on the pages that use them. Then you only have to load RequireJS itself (which is about 14kb) on all pages, saving roughly 385kb.
Integration is also very easy: just wrap the code you have in a require() call:
require(["jquery", "jquery.alpha", "jquery.beta"], function($) {
//the jquery.alpha.js and jquery.beta.js plugins have been loaded.
$(function() {
$('body').alpha().beta();
});
})
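On each page you then only need the small RequireJS loader plus an entry point; the paths and file names below are assumptions, not from your setup:

<script data-main="/js/main" src="/js/require.js"></script>

// /js/main.js (sketch): tell RequireJS where the library files live
requirejs.config({
    baseUrl: "/js/lib",
    paths: { "jquery": "jquery.min" }
});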