Javascript: Dynamically inject javascripts when you need them – Mr. Joel Kemp

There’s been a lot of discussion around ways of minimizing the impact of JavaScript load times by loading scripts only when you need them. I wrote a plugin that allows you to do just that: $.inject

A little background

Nicholas Zakas has spoken quite a bit about this lazy-loading optimization in one of his talks.

The gist is that window.onload (an event signaling that the page has finished rendering and is ready for interaction) fires only after all of your JS has been fetched, parsed, and executed. This applies to JS included in the <head> section, at the end of the <body> tag, and even to asynchronously injected scripts (in either of those sections).

The downside to delaying onload is that the page is unscrollable and unusable (but rendered/visible) until onload fires.

The way around this is to dynamically inject your scripts when you need them. If a user clicks a button/link that uses a vendor library (and it’s only at that point that your app needs that dependency), then fetch those scripts at that moment and perform the functionality once the script(s) have loaded.
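Concretely, the click-to-load pattern looks something like the sketch below. It assumes jQuery is already on the page; the button id, the /js/uploader.js path, and startUploadFlow() are all made-up names for illustration:

```javascript
// Sketch of click-triggered lazy loading. Assumes jQuery is on the page;
// '#upload-btn', '/js/uploader.js', and startUploadFlow() are hypothetical.
function wireLazyUpload($) {
  var loaded = false;

  $('#upload-btn').on('click', function () {
    if (loaded) {
      startUploadFlow(); // entry point that uploader.js would define
      return;
    }

    // $.getScript fetches and executes the script, then fires the callback.
    $.getScript('/js/uploader.js', function () {
      loaded = true;
      startUploadFlow();
    });
  });
}
```

Under the hood, $.getScript is just a shorthand for $.ajax() with a dataType of 'script'.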

You can tell when a script has loaded either by injecting a <script> tag into the <head> of your page and listening for that tag’s onload event (or onreadystatechange in older IE), or by using $.ajax() with a dataType of 'script' and binding to the done method of the returned deferred.
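A bare-bones version of the first approach might look like the following plain-DOM sketch; assigning the same handler to both onload and onreadystatechange covers modern browsers and the old-IE readyState dance in one place:

```javascript
// Inject a <script> tag into <head> and fire a callback once it has loaded.
// Plain-DOM sketch of the script-tag approach described above.
function loadScript(src, callback) {
  var script = document.createElement('script');
  script.src = src;

  script.onload = script.onreadystatechange = function () {
    // Modern browsers have no readyState on script elements (onload fires);
    // older IE walks readyState through 'loaded'/'complete' instead.
    if (!this.readyState ||
        this.readyState === 'loaded' ||
        this.readyState === 'complete') {
      this.onload = this.onreadystatechange = null;
      callback();
    }
  };

  document.getElementsByTagName('head')[0].appendChild(script);
}
```

The jQuery equivalent of the second approach is $.ajax({ url: src, dataType: 'script' }).done(callback).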

A real-world scenario

As a concrete example, I needed a portion of jqueryUI only for a photo upload library that I was using. No other part of my app needed jqueryUI, which is 11kb to 15kb at the bare minimum, plus the HTTP round-trip time. In addition, after thinking about it some more, I realized I only needed the photo upload library once the user clicked the photo upload button. Hence, these two scripts were great candidates for an ordered lazy load.

I used my $.inject plugin and voilà: the cost/delay of those two libs was shaved off of window.onload. Yes, the user had to wait for the two HTTP round trips before being able to start the upload flow, but it’s not a feature that everyone will be using. The majority of users shouldn’t have to take the performance hit for the features of the minority.
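The “ordered” part matters here: the upload library depends on jqueryUI, so the second script must not execute before the first finishes loading. The sequencing logic itself is small; here’s a plugin-agnostic sketch (this is not $.inject’s actual API; loadFn stands in for whatever single-script loader you use):

```javascript
// Load a list of script URLs strictly in order, then fire a callback.
// loadFn(src, cb) is any single-script loader (e.g. a <script>-tag injector).
function loadInOrder(urls, loadFn, done) {
  var i = 0;
  (function next() {
    if (i >= urls.length) {
      done();
      return;
    }
    loadFn(urls[i++], next); // only start the next script once this one loads
  })();
}
```

For the scenario above, that would be something like loadInOrder(['/js/jquery-ui.min.js', '/js/uploader.js'], someLoader, startUpload) — with those paths and names standing in for the real ones.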

How does this work with bundled vendor scripts?

If you bundle your vendor scripts into one big JS file, then you won’t be able to avoid the parsing of those scripts (though you’ll save the HTTP round trips). Using RequireJS could help, but my gut is that the loader still runs/parses through the define calls to register the modules – only executing their definitions when required (no pun intended). Please correct me if I’m wrong on that one. So there’s still time spent parsing those unused vendor libs.
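That register-now, execute-later behavior can be illustrated with a toy registry. This is a simulation, not RequireJS itself (the real define() also takes a dependency list); the functions are named defineModule/requireModule here to stand in for define()/require():

```javascript
// Toy module registry illustrating registration cost vs. execution cost.
// Not RequireJS: defineModule/requireModule stand in for define()/require().
var registry = {};
var cache = {};

function defineModule(name, factory) {
  registry[name] = factory; // parsing/registration cost is paid up front...
}

function requireModule(name) {
  if (!(name in cache)) {
    cache[name] = registry[name](); // ...the factory only runs on first require
  }
  return cache[name];
}
```

Registering a module stores its factory without calling it; the body only executes on the first requireModule() call. That matches the intuition above: the define calls still get parsed even if the module definitions never run.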

Alternatively, you could just bundle core vendor scripts and keep non-core ones to be lazily loaded.

It should be noted that mobile users will also suffer quite a bit from the increased download time of that massive merged-and-minified vendor JavaScript file.

Moving forward

To some, this might feel like a premature optimization. However, I believe that this type of cost-cutting is an easy and essential win – for both client-side and server-side rendered apps. Perhaps more so for the former, where you want to get to rendering as quickly as possible. The latter (server-side rendered apps) still benefits: not only does your content load incredibly quickly, but the user can start using the page that much faster.

Anyone else using this lazy-loading method to great success?

Happy coding!