@Rich-Harris
Last active May 6, 2024 10:23
Why imperative imports are slower than declarative imports

A lot of people misunderstood Top-level await is a footgun, including me. I thought the primary danger was that people would be able to put things like AJAX requests in their top-level await expressions, and that this was terrible because await strongly encourages sequential operations even though a lot of the asynchronous activity we're talking about should actually happen concurrently.

But that's not the worst of it. Imperative module loading is intrinsically bad for app startup performance, in ways that are quite subtle.

Consider an app like this:

// main.js
import foo from './foo.js';

foo();

// foo.js
import bar from './bar.js';
import a from './a.js';
import b from './b.js';

export default function foo () {
  bar( a + b );
}

// bar.js
import c from './c.js';
import d from './d.js';

export default function bar ( x ) {
  console.log( c + d + x );
}

// a.js
export default 1;

// b.js
export default 2;

// c.js
export default 3;

// d.js
export default 4;

When main.js is loaded and parsed, we can immediately determine (without having to run the code) that it has a dependency on foo.js, so we start loading that. Once it arrives, we see it depends on three other modules (bar.js, a.js and b.js), so they start loading concurrently. Once bar.js is loaded, we can set off loading c.js and d.js.

So from main.js there are just three hops – the depth of the dependency graph (main -> foo -> bar -> c/d) – to load the entire app.
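Counting those hops is just a breadth-first walk of the statically known dependency graph. A minimal sketch – the `graph` object mirrors the example app above, and `countWaves` is an illustrative helper, not part of any real loader:

```javascript
// With declarative imports the whole graph is known without running any code,
// so modules load in breadth-first "waves" – one network round trip per level.
const graph = {
  'main.js': ['foo.js'],
  'foo.js': ['bar.js', 'a.js', 'b.js'],
  'bar.js': ['c.js', 'd.js'],
  'a.js': [], 'b.js': [], 'c.js': [], 'd.js': [],
};

function countWaves(entry) {
  let wave = [entry];
  let waves = 0;
  const seen = new Set(wave);
  while (wave.length) {
    // everything in the current wave is fetched concurrently...
    const next = wave.flatMap(id => graph[id]).filter(id => !seen.has(id));
    next.forEach(id => seen.add(id));
    wave = next;
    if (wave.length) waves += 1; // ...costing one more round trip
  }
  return waves;
}

console.log(countWaves('main.js')); // 3 – main -> foo -> bar -> c/d
```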

What if some of those imports were imperative?

// main.js
import foo from './foo.js';

foo();

// foo.js
import bar from './bar.js';
import a from './a.js';

const b = await import( './b.js' ); // <-- imperative

export default function foo () {
  bar( a + b );
}

// bar.js
import c from './c.js';

const d = await import( './d.js' ); // <-- imperative

export default function bar ( x ) {
  console.log( c + d + x );
}

// a.js
export default 1;

// b.js
export default 2;

// c.js
export default 3;

// d.js
export default 4;

Even though this is basically the exact same app, something curious has happened. We load foo.js, as before, triggering a subsequent load of bar.js and a.js (but not b.js, because we don't execute the code until all the dependencies have been loaded and evaluated). As soon as bar.js comes in, we load c.js (but not d.js).

The same three hops (main -> foo -> bar -> c), and we've got all the dependencies that are declared statically. Now for the next phase – evaluation. Evaluation order is guaranteed by the order of import declarations, so we start with c.js. Then we can evaluate bar.js. That's when we hit the await import('./d.js'). Evaluation pauses until the fourth load has completed and d.js has itself been evaluated. Then, with bar.js done, we can move on to foo.js, whereupon we hit await import('./b.js') and have to wait for a fifth load to happen. We load and evaluate b.js, then finish evaluating foo.js, then finally we can actually run our app by evaluating main.js.
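The cost of those extra pauses can be sketched with plain promises – `load` is a stand-in for a network fetch, and the 10ms delay is arbitrary:

```javascript
// Stand-in for a network fetch: resolves after a fixed delay.
const load = name => new Promise(resolve => setTimeout(() => resolve(name), 10));

// Imperative style: evaluation pauses at each await, so the two extra
// loads happen one after the other – two additional round trips.
async function imperative() {
  const start = Date.now();
  await load('d.js');
  await load('b.js');
  return Date.now() - start; // roughly 2 × 10ms
}

// Declarative style: both dependencies were discovered up front,
// so they load concurrently – a single round trip.
async function declarative() {
  const start = Date.now();
  await Promise.all([load('d.js'), load('b.js')]);
  return Date.now() - start; // roughly 1 × 10ms
}
```

`imperative()` settles after roughly two delays; `declarative()` after one.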

We've gone from three 'waves' of module loads (the depth of the dependency graph, i.e. the distance from the entry point to the deepest dependency) to five – the depth, plus the number of imperative imports, since they have to happen sequentially. (And that's before we account for any dependencies those imperatively imported modules might have.)

Now imagine a less contrived situation, where you have dozens or hundreds of modules – it only takes a handful of await import(...) calls to seriously slow down your app startup. Say an app with 100 modules has a depth of 6. It only takes 6 imperative imports (fewer, if those modules have dependencies of their own!) to double the length of time it takes to load all the modules and start your app.
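The back-of-the-envelope arithmetic from that paragraph (both figures are hypothetical):

```javascript
// Total load 'waves' = graph depth + imperative imports serialized on the
// critical path (ignoring any dependencies those imported modules might have).
const depth = 6;             // depth of an imagined 100-module app
const imperativeImports = 6; // await import(...) calls hit during evaluation
const waves = depth + imperativeImports;
console.log(waves, waves / depth); // 12 waves – startup latency doubles
```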

Gross oversimplification? Possibly. The point stands – declarative imports are faster by their very nature.

And yes, there is a way to achieve dynamic module loading with declarative import declarations.

A couple of observations

  • require(...) is imperative, and thus subject to the same logic as we've described above. JavaScript modules will make your apps faster.
  • With or without HTTP2, dependency graph depth is a factor in startup time. You can reduce your dependency graph depth to zero by bundling your app. (Yes, that also applies to Node apps).
  • Of course, bundling everything prevents you from taking advantage of concurrency and caching. The optimal HTTP2 strategy is probably to bundle chunks of your app, e.g. leaving large third-party dependencies to CDNs.

Corrections welcome.

@JonathanWolfe

I feel you are completely right about all these issues with top-level await. I hope the review bodies take notice.

@benjamn

benjamn commented Sep 12, 2016

Even worse, top-level await introduces the possibility of deadlock during app startup:

Module main.js:

import "./a";

Module a.js:

await import("./b");

Module b.js:

await import("./a");

The fundamental problem here is that top-level await allows the author of module foo to delay the resolution of the Promise returned by import("foo") indefinitely. Without top-level await, the resolution of the promise returned by import("foo") can be delayed only by

  • a delay in fetching module foo over the network, which can be mitigated by bundling, timeouts, developer tools, etc.
  • expensive synchronous computation performed within foo, which should be easy to spot and postpone.

We already know how to cope with both of these delays, whereas a deadlocking cycle between modules using top-level await could involve many different modules, be sensitive to race conditions, etc. etc.

Now, it's conceivable that import("./a") could return an already-resolved Promise for a partially populated namespace object, if a.js is currently suspended on an await, so that b.js can continue executing, similar to how CommonJS handles dependency cycles. But that only "solves" the problem if all the Promises involved in the deadlock were obtained from an import(...) function. The risk of deadlock arises as soon as we have top-level await, even if we do not have an imperative import function.
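That CommonJS behaviour – handing back a partially populated exports object on a cycle instead of deadlocking – can be sketched with a toy module registry (`fakeRequire`, the registry, and the module bodies are purely illustrative, not Node's actual loader):

```javascript
// Toy sketch of CommonJS cycle handling: on re-entrant require, the cached
// (still incomplete) exports object is returned rather than blocking.
const registry = {
  a: (module, require) => {
    module.exports.early = 1;      // assigned before the cycle – b will see it
    const b = require('b');        // b runs now, and requires a re-entrantly
    module.exports.late = b.value + 1;
  },
  b: (module, require) => {
    const a = require('a');        // cycle! a is still mid-evaluation
    module.exports.sawLate = 'late' in a; // false – not assigned yet
    module.exports.value = a.early;       // 1 – already assigned
  },
};

const cache = {};
function fakeRequire(id) {
  if (cache[id]) return cache[id].exports; // re-entry: partial exports
  const module = (cache[id] = { exports: {} });
  registry[id](module, fakeRequire);
  return module.exports;
}

console.log(fakeRequire('a')); // { early: 1, late: 2 }
```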

@robpalme

I simultaneously agree with this piece and also feel it's an obvious tradeoff not worth shouting about: runtime discovery will be slower if you convert a declarative use-case to an imperative one. But those use-cases are not why imperative import() was invented!

Also, I feel this slightly exaggerates, or at least over-emphasizes, the performance hit, which may generate unfair FUD around import() and top-level await.

Yes, discovering imports at runtime is slower than having them declaratively specified.

But... I don't think it's fair to say (or imply) that pure re-ordering of work, or permitting module body evaluation to span multiple JS engine ticks, necessarily harms performance.

Counting the number of "waves" slightly conflates a certain cost (deferred fetch) with potentially negligible costs (work re-ordering, multi-tick evaluation).

Regardless, I appreciate Rich's efforts & education here.

@franciscop

require(...) is imperative, and thus subject to the same logic as we've described above. JavaScript modules will make your apps faster.

This is arguably not so important in Node.js (if you are referring to that) or when compiling the code, since it is done once on server start and then the code is cached. The same doesn't happen in the browser, where each request would have to await.

@domenic

domenic commented Sep 14, 2016

This is not generally true, as engines are able to speculatively preload even imperative imports.

@getify

getify commented Sep 14, 2016

@domenic

const b = await import( `b.${fate ? 'mjs' : 'js'}` );

@getify

getify commented Sep 14, 2016

@Rich-Harris

I believe the "solution" to the issues you present here is the same as it was for just regular ol' ES6 modules prior to all this blow-up... either your markup should send a bunch of <link rel=preload ..> tags to let the browser know it needs to start loading stuff early, or you need to use HTTP2 and have the server send down all the files it knows will be needed even though the browser hasn't yet asked for them.

Side note: the <link rel=preload> approach is the strategy I'm baking into the next-generation of LABjs.

@boneskull

@franciscop

This is arguably not so important in Node.js (if you are referring to that) or when compiling the code, since it will be done once on server start then the code is cached. The same doesn't happen in the browser, since each request would have to await.

require() is a blocking I/O op, and countless Node.js modules (and especially short-running CLI tools) mitigate this cost with hand-rolled "lazy loading" implementations.

Reading from the local FS may be an order of magnitude faster than HTTP, but declarative imports would absolutely make an impact in Node.js...

@robpalme

@domenic "engines are able to speculatively preload even imperative imports" - this statement is also not generally true.

Speculative preload is possible if we restrict import to only accept string literals. But as soon as you can feed it dynamic specifiers all bets are off. With the exception of using historical data from past usage, which is getting tenuous.

Do you expect to advocate/encourage string literals in import() so that users stand a chance of benefiting from this speculative preloading?

@getify

getify commented Sep 14, 2016

@robpalme @domenic

It's also not "generally true" when you consider that system loaders (IIUC) can be configured to transform a module ID received from import / import(..) into something else, like an alternate URL. Speculative loading in this case could try to guess that this wasn't going to happen, but may in fact fetch a URL that's bogus, or just one that the code won't actually use.

I'm sure there will be cases where speculative loading could help, but to suggest it's the "general" case is a stretch.

@bergus

bergus commented Sep 24, 2016

The argument here seems to be one against await import, not against top-level await in general. Standard import declarations can be fetched in parallel already.
Sure, calling System.import from a module instead of doing a regular import is probably an antipattern, but necessary in some cases. And top-level await would greatly simplify dealing with the exports that depend on the dynamically loaded module.

@ericandrewlewis

How would you define an imperative vs. declarative import?

@masaeedu

masaeedu commented Nov 1, 2017

The argument makes even less sense than in the previous installment: regardless of whether the language has top-level await, if the current module requires a dynamic import (or indeed the result of any promisified computation) before it can finish exporting whatever it's exporting, it will itself have to export a promise. Whether you use then or await for this is totally irrelevant.

By all means, if it is possible to refactor stuff so that all your imports become static, do that. Either way, the whole debate is orthogonal to the utility of top-level await as a language feature.

@determin1st

determin1st commented Aug 2, 2021

import/export are restricted to https only
+more restrictions coming from spec bureaucrats

good luck making your lib with those by default, I'd better use good ol' monoliths..
practice shows that they are the majority in non-corp web resources

folks keep saying that there was a "module war", everything was bad and suddenly the combination of those becomes a good spec. the export could just have been erased, and all the vars that would become global could just jump into the export. why do I, some random guy without spec chevrons, find those shortcuts instantly? who are those spec'ers? they don't even have any identity, just some plenary voting stuff :)) funny modules

@gopi-suvanam

My take away from this gist:

Of course, bundling everything prevents you from taking advantage of concurrency and caching. The optimal HTTP2 strategy is probably to bundle chunks of your app, e.g. leaving large third-party dependencies to CDNs.

I hope it will be that, and not the Node way of downloading third-party dependencies and bundling them all together.
