Since the time of writing nearly a decade has passed. A lot has happened. We now have additional attributes such as `async` to promise the browser that we don't use garbage like `document.write`. This is great, but unfortunately browser support could be better. As usual, there seems to be a correlation between people using old browsers and people having mediocre (at best) Internet bandwidth. Ouch! Time to think again... or am I wrong here?
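For completeness, this is what these attributes look like in markup (the file names are hypothetical):

```html
<!-- async: download in parallel, execute as soon as ready (order not guaranteed) -->
<script async src="analytics.js"></script>

<!-- defer: download in parallel, execute in document order after parsing finishes -->
<script defer src="app.js"></script>
```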
Back then, two module systems dominated the JavaScript landscape:

- CommonJS (most prominently used by Node.js; synchronous at heart)
- AMD (advocated by RequireJS; asynchronous at heart)
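The two styles can be caricatured with a toy in-memory registry. This is purely illustrative (a real loader resolves files and handles asynchrony), but it shows the shape of both APIs:

```javascript
// A minimal in-memory registry standing in for both module systems.
const registry = new Map();

// AMD style: define(name, dependencies, factory).
// Dependencies are resolved (in real loaders: asynchronously) before the factory runs.
function defineAMD(name, deps, factory) {
  const resolved = deps.map((d) => registry.get(d));
  registry.set(name, factory(...resolved));
}

// CommonJS style, simulated: require() hands back the cached exports synchronously.
function requireCJS(name) {
  return registry.get(name);
}

defineAMD("math", [], () => ({ add: (a, b) => a + b }));
defineAMD("app", ["math"], (math) => ({ result: math.add(2, 3) }));

console.log(requireCJS("app").result); // 5
```

The key difference sits in the signatures: CommonJS pulls dependencies in synchronously as it executes, while AMD declares them up front so a loader can fetch them in parallel first.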
With the introduction of the ES6 module system the disadvantages of both seem to have been resolved, and we can now write elegant code that is asynchronous without much trouble. Very nice! Independent of the module system used, there are other advantages of using modules. With modules we can include only the code we really need - at least at a library or module level. If modules were as small as functions (please don't), this granularity would be very fine and reflect exactly the code that's needed.
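As a small illustration of the ES6 syntax - in a real application the import would reference a file such as `./math.js`; here a `data:` URL stands in for the file so the sketch is self-contained, and the dynamic form shows that loading is asynchronous by design:

```javascript
// In an ES module you would normally write:
//   import { add } from "./math.js";           (static, resolved up front)
//   const { add } = await import("./math.js"); (dynamic, loaded on demand)
// A data: URL replaces the hypothetical file here so the sketch runs anywhere.
const source = "data:text/javascript,export const add=(a,b)=>a+b;";

import(source).then(({ add }) => {
  console.log(add(2, 3)); // 5
});
```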
However, there is yet another big advantage: we can simply drop everything from our webpage and include only the loader script. This script triggers all subsequent downloads, which are all asynchronous and - if possible - executed in parallel. Therefore, the usual cascade we expect from the browser, as well as the delay that comes with it, is mostly a thing of the past.
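Such a loader can be surprisingly small. A minimal sketch (file names hypothetical; error handling and caching omitted):

```html
<head>
  <!-- The only script tag on the page: a tiny loader -->
  <script>
    function load(src) {
      return new Promise(function (resolve, reject) {
        var s = document.createElement("script");
        s.src = src;
        s.async = true; // downloads run in parallel
        s.onload = resolve;
        s.onerror = reject;
        document.head.appendChild(s);
      });
    }
    // The loader itself specifies the dependencies:
    Promise.all([load("vendor.js"), load("main.js")]);
  </script>
</head>
```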
In order to profit as much as possible from such a loader we have to include it in the head. The loader is usually very small and should be cached (it won't change often and may be taken from a non-evil CDN). Therefore, even browsers that do not support `defer` shouldn't suffer too much from this change. What have we gained?
First, the script starts downloading earlier than it would at the bottom of the page. Second, since the loader downloads another script (our main script), which may trigger zero to many downloads (of dependencies), these downloads also start earlier and - at best - run in parallel. The page loads faster. Finally, the loader clearly specifies which dependencies are required. This helps to prevent unnecessary deployment issues or doubt about the actual requirements. It's all very well specified.
TL;DR: If you still have any kind of web application without a proper loading script, it's time to switch gears. I have talked to many people over the years, and I get the impression that after a decade they have finally understood why scripts should - in general - be put at the bottom of the page. Unfortunately, that is already one generation behind the current state of the art.
Remarks: Generally, one should still use gzip and a (non-evil) CDN. The CSS, however, should be served from the same domain as the page, so the browser can reuse its already established connections instead of paying for another DNS lookup and TCP handshake. Bundling all JS into one file is not necessarily the ultimate solution. In the best case, files should be served in chunks of roughly 15 kB, since TCP's slow start begins with an initial congestion window of just 10 packets (remember, one packet's MTU is 1500 bytes). This also favors HTTP/2 batching and CSS sprites. Finally, link prefetching is also a good way to gain some speed.
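The 15 kB rule of thumb can be checked on the back of an envelope. Assuming the common initial congestion window of 10 segments (RFC 6928) and a 1500-byte MTU with roughly 40 bytes of IP and TCP headers per packet:

```javascript
// Back-of-the-envelope check of the "15 kB" rule of thumb.
const initialWindowSegments = 10; // TCP initial congestion window (RFC 6928)
const mtu = 1500;                 // bytes per packet
const headerOverhead = 40;        // IPv4 (20) + TCP (20) headers, no options

const payloadPerSegment = mtu - headerOverhead;                  // 1460 bytes
const firstRoundTripBytes = initialWindowSegments * payloadPerSegment;

console.log(firstRoundTripBytes); // 14600 - a ~15 kB file fits in one round trip
```

Anything larger than that needs at least one additional round trip before the browser sees the end of the file, which is why many small-ish chunks delivered over one multiplexed connection can beat a single huge bundle.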