When downloading multiple commonly used JavaScript/CSS files (e.g. Bootstrap and jQuery), many topics like this one recommend using a CDN, one of the main arguments being that the files can then be loaded asynchronously.

How does that work? To the best of my knowledge, <script> tags in the header are read synchronously, so the browser won't actually look at the second CDN file until the first one is finished.
<script src="//ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
<script src="//cdnjs.cloudflare.com/ajax/libs/popper.js/1.16.0/umd/popper.min.js"></script>
<script src="//maxcdn.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js"></script>
How can I make the page download the scripts asynchronously, but execute them synchronously (in order)? Or is that actually happening by default somehow? And what about CSS files: will my
<link rel="stylesheet" href="//maxcdn.bootstrapcdn.com/bootstrap/4.5.2/css/bootstrap.min.css">
behave any differently in that sense? I would like to understand the loading process properly before adding my own failovers to local copies (for when the CDN is down), so as to avoid getting stuck with synchronous downloading.
(Note that, despite the near-identical title, this is not a duplicate of this question, which is about loading scripts dynamically.)
Also note that I can't use defer (at least not in the vanilla way that I know of), as that would prevent me from adding said failover for when the CDN is down. For example,
<script src="//netdna.bootstrapcdn.com/twitter-bootstrap/2.2.1/js/bootstrap.min.js"></script>
<script> $.fn.modal || document.write('<script src="Script/bootstrap.min.js">\x3C/script>')</script>
would be broken by simply adding defer.
2 Answers
I think you can still use defer, just put your fallback code into an event handler. MDN notes (https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script) that a deferred script executes after the document has been parsed but before DOMContentLoaded fires, and that deferred scripts execute in the order in which they appear in the document, so DOMContentLoaded could be a good pick. Or you can put the fallback code into a separate .js file and load it with defer too, relying on that same in-order execution.

It's more about parallelism than asynchronousness. (They're certainly related, but the CDN argument related to limits on multiple downloads from the same origin is about parallelism.)
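To illustrate the first answer's defer-plus-event-handler suggestion, a minimal sketch: the local path Script/bootstrap.min.js comes from the question, while the feature test and dynamic insertion are illustrative.

```html
<script defer src="//netdna.bootstrapcdn.com/twitter-bootstrap/2.2.1/js/bootstrap.min.js"></script>
<script>
  // Deferred scripts finish executing before DOMContentLoaded fires,
  // so by the time this handler runs, the feature test is meaningful.
  document.addEventListener("DOMContentLoaded", function () {
    if (!(window.jQuery && jQuery.fn.modal)) {
      var s = document.createElement("script");
      s.src = "Script/bootstrap.min.js"; // local fallback path from the question
      document.body.appendChild(s);
    }
  });
</script>
```

Note that the dynamically inserted fallback loads asynchronously, which is probably acceptable in the degraded CDN-down case.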
Any decent browser, when given the three script tags you’ve shown, will download them in parallel (up to its parallel-from-the-same-site limit) and then execute them in order. You don’t have to do anything to make that happen. Browsers read ahead in the HTML to find resources to fetch.
Adding fallback scripts with document.write might complicate the browser's ability to do that, or even prevent it, but you can ensure the parallel downloads declaratively using <link rel="preload" as="script" href="..."> (more on MDN). You can then combine that with fallback scripts for failed CDN resources.

Note that this doesn't preload the fallbacks. You could, but then you'd be loading them even when the CDN was working, which wastes the end user's bandwidth. The fallbacks would be for the presumably-temporary degraded situation where the CDN was unavailable, where a degraded user experience is probably okay. (You could even show the user an indicator of a problem when scheduling the fallback, like Gmail's "something is taking longer than usual" indicator.)
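Concretely, the preload-plus-fallback markup might look something like this, a sketch using the question's CDN URLs; the local Script/ paths and the per-library feature tests are assumptions mirroring the question's own fallback:

```html
<!-- Preload hints: the browser can start all three downloads immediately -->
<link rel="preload" as="script" href="//ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js">
<link rel="preload" as="script" href="//cdnjs.cloudflare.com/ajax/libs/popper.js/1.16.0/umd/popper.min.js">
<link rel="preload" as="script" href="//maxcdn.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js">

<!-- Execute in order; after each, test a global the script should have
     defined and fall back to an assumed local copy if the CDN failed -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="Script/jquery.min.js">\x3C/script>')</script>
<script src="//cdnjs.cloudflare.com/ajax/libs/popper.js/1.16.0/umd/popper.min.js"></script>
<script>window.Popper || document.write('<script src="Script/popper.min.js">\x3C/script>')</script>
<script src="//maxcdn.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js"></script>
<script>(window.jQuery && jQuery.fn.modal) || document.write('<script src="Script/bootstrap.min.js">\x3C/script>')</script>
```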
If you're bothered by repeating the URLs and you're okay with document.write in small doses (as you seem to be), you can avoid duplicating them by generating both the CDN tags and the fallback tags from a single list of resources. (Since that's all inline script you're unlikely to transpile, it's best kept to ES5-level syntax in case you have to support obsolete environments.)
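A sketch of that single-list approach in ES5; the resource structure, the loaded() feature tests, and the local Script/ paths are my illustrative assumptions:

```javascript
// One list describes each resource: CDN URL, assumed local fallback,
// and a feature test that tells us whether the CDN copy executed.
var resources = [
  { cdn: "//ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js",
    local: "Script/jquery.min.js",
    loaded: function () { return typeof jQuery !== "undefined"; } },
  { cdn: "//cdnjs.cloudflare.com/ajax/libs/popper.js/1.16.0/umd/popper.min.js",
    local: "Script/popper.min.js",
    loaded: function () { return typeof Popper !== "undefined"; } },
  { cdn: "//maxcdn.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js",
    local: "Script/bootstrap.min.js",
    loaded: function () { return typeof jQuery !== "undefined" && !!jQuery.fn.modal; } }
];

// "\x3C" is "<"; it keeps the HTML parser from treating the literal
// "</script>" as the end of the inline script containing this code.
function scriptTag(src) {
  return '\x3Cscript src="' + src + '">\x3C/script>';
}

// Called from an inline script in <head>: writes the CDN script tags.
function writeCdnTags() {
  for (var i = 0; i < resources.length; i++) {
    document.write(scriptTag(resources[i].cdn));
  }
}

// Called from a second inline script placed after the first one: by the
// time it runs, the written CDN scripts have executed (or failed), so
// any resource whose feature test fails gets a local fallback tag.
function writeFallbacks() {
  for (var i = 0; i < resources.length; i++) {
    if (!resources[i].loaded()) {
      document.write(scriptTag(resources[i].local));
    }
  }
}
```

The two-script split matters: document.write inserts the CDN tags immediately after the inline script that calls writeCdnTags(), so a later inline script calling writeFallbacks() runs only after those scripts have been fetched and executed in order.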