@justinfagnani justinfagnani/README.md
Last active Jul 1, 2019


Inline JavaScript Module Definitions

Motivation

Domenic's blöcks proposal outlines a way to conveniently define functions that run in another worker/worklet as an inline, non-capturing function body.

Blöcks as proposed have a few open questions and lack a few features that would generalize them to more use cases with more practical ergonomics.

  • Blöcks don't allow static imports, which makes it harder for them to import necessary library code. They must rely on dynamic import, which is somewhat more difficult to statically analyze.
  • Blöcks implement only one function body. It's unclear how that function could share state with another if necessary.
  • There is some awkward syntax and semantics around simulating captured variables via structured clone and transferrables.

I think these issues could be addressed with a variation on the blöcks idea that breaks out some of its features into separate layers and generalizes them:

  • Inline Modules: Replace blöcks as a function-like declaration with inline module definitions.
  • ModuleWorker: Add a small abstraction over Workers, modules, and postMessage to make it easier to access and invoke exports of a worker module.
  • WorkerFunction: Add a very small amount of sugar over ModuleWorker to make it easier to create a local function that invokes an exported function in a ModuleWorker.

And in addition to the identified blöcks use cases, inline modules address a few others.

Use Cases

Workers and Worklets

Ergonomic creation of workers directly from within a containing module.

Async Module Initialization

Let modules define initialization code that runs before the module body, and possibly before the module's imports, as a safer alternative to top-level await for async module initialization.

Settings for imports

The pattern of applying settings to a module before importing it requires multiple files, because all imports finish evaluating before any statements run. Inline modules could be used to evaluate some statements before imports.

Bundling

Allow multiple modules to be defined in one file as a light-weight bundling mechanism.

Testing

Testing code that deals in modules as first-class objects often requires creating many small test files, which makes tests harder to read and write.

Overview

Inline Modules

Inline modules would add a new syntax for defining a module within another module.

The syntax, borrowed from blöcks, is intended to indicate that the inline module definition is not a nested lexical scope, but isolated from its containing scope:

const y = 0;
const moduleDefinition = {|
  import * as foo from 'foo';

  export let x = 1;

  // ... other code here as usual ...

  console.log(y); // Error, not defined.
|};

A bare inline module definition does not create a new module immediately. This is because we don't yet know whether the module should be created in the same realm, or in a worker. A module has to be instantiated in some way from an inline module definition.

Modules instantiated from inline modules have static imports and exports like any other module.

Inline modules cannot capture variables from outer scopes.

ModuleWorker

ModuleWorker is a worker created from a module. It exposes the module's exports on a .module property and allows access to those exports without the direct use of postMessage():

const worker = new ModuleWorker('module.js');
const mod = await worker.module;
console.log(await mod.x);
const f = await mod.f;
console.log(await f(a, b, c));

ModuleWorker#module is a Promise of an asynchronous module object. All exported properties of the module are available as Promises.
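The shape of this asynchronous module object can be sketched today with a Proxy that resolves each export by name. In this sketch, `makeAsyncModule` and the in-thread `resolveExport` stand-in are illustrative names, not part of the proposal; a plain object stands in for the worker side so it runs anywhere:

```javascript
// Sketch of the asynchronous module object: every property access
// yields a Promise of the export's value.
const makeAsyncModule = (resolveExport) => new Proxy({}, {
  get: (_target, name) => Promise.resolve(resolveExport(name)),
});

// Stand-in for a worker module's exports.
const workerExports = {x: 1, f: (a, b) => a + b};
const mod = makeAsyncModule((name) => workerExports[name]);

mod.x.then((x) => console.log(x));        // logs 1
mod.f.then((f) => console.log(f(2, 3)));  // logs 5
```

A real implementation would resolve exports over postMessage rather than from a local object, but the host-side surface is the same.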

Exports are transferred to the host module via postMessage and an algorithm that makes it easier to use transferrables and functions. This algorithm is not a full proxy system like ComLink - it just attempts to make the existing structured clone and transferrable semantics a bit easier to use.

  • If the export is transferrable, send it in the transfer list of postMessage.
  • If the export is not transferrable, send it as the data argument of postMessage, which uses the structured clone algorithm.
  • If a DATA_CLONE_ERR exception is thrown, send a message representing the error instead, and reject the Promise in the host module.

No special handling is added for functions, objects, or classes of any type. This keeps the system simple and much closer to postMessage semantics. Since exports are static and named, we don't have to keep tables of proxies and references, and we don't have to try to proxy object instances, synchronous methods, or callbacks - the difficult things that libraries like ComLink do.
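The per-export decision above can be sketched as a small helper. `prepareExport` and the explicit type list are assumptions for illustration; the real set of transferrable types is host-defined:

```javascript
// Hypothetical subset of transferrable types; the typeof guards keep the
// sketch runnable in environments where some types are absent.
const TRANSFERRABLE_TYPES = [
  ArrayBuffer,
  typeof MessagePort !== 'undefined' ? MessagePort : undefined,
  typeof ImageBitmap !== 'undefined' ? ImageBitmap : undefined,
].filter((t) => t !== undefined);

const prepareExport = (value) => {
  // Transferrable: list it in postMessage's transfer list.
  if (TRANSFERRABLE_TYPES.some((T) => value instanceof T)) {
    return {data: value, transfer: [value]};
  }
  // Otherwise rely on structured clone; a clone error thrown by
  // postMessage itself would be reported back and reject the host Promise.
  return {data: value, transfer: []};
};

console.log(prepareExport(new ArrayBuffer(8)).transfer.length); // 1
console.log(prepareExport({x: 1}).transfer.length);             // 0
```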

Invoking functions is critical functionality though, so we add the ability to invoke only exported functions with the ModuleWorker#call() method.

console.log(await mod.call('f', a, b, c));

This makes it slightly easier to create a callable function from a module:

const f = (...args) => mod.call('f', ...args);

But more on that later...

Library code can work on top of ModuleWorker to implement ComLink-style abstractions via postMessage. Implicit in this proposal, but not yet defined, is a protocol for getting the values of exports and invoking functions, which libraries can implement or intercept on both the host and worker sides.

WorkerFunction

The blöcks proposal has fairly nice ergonomics for creating a worker that exports one callable function:

const result = await worker<endpoint>{|
  const res = await fetch(endpoint);
  const json = await res.json();

  return json[2].firstName;
|};

In particular:

  • Blöcks are just a function body with a return statement.
  • Blöcks can close over variables with a special syntax.
  • Workers created from a blöck come with a ready-made Promise-returning function.

Inline modules require a bit more boilerplate to get to a callable function:

const mod = new ModuleWorker({|
  export const f = async (endpoint) => {
    const res = await fetch(endpoint);
    const json = await res.json();

    return json[2].firstName;
  };
|});
const f = () => mod.call('f', endpoint);
const result = await f();

Closing over variables is unlikely to be a component of an inline module proposal, but we can sugar over a bit of the host-side setup:

WorkerFunction is a small wrapper around ModuleWorker, and a subclass of Function, that creates a callable Promise-returning function out of one export of a worker module.

It doesn't offer closing over variables, but the needed objects can just be passed as parameters:

const result = await new WorkerFunction({|
  export default async (endpoint) => {
    const res = await fetch(endpoint);
    const json = await res.json();

    return json[2].firstName;
  };
|})(endpoint);

If the separation of parameters and closed-over variables is important to obtain a function of the correct signature in the host module, we can just wrap the WorkerFunction:

const f = () => new WorkerFunction({|
  export default async (endpoint) => {
    const res = await fetch(endpoint);
    const json = await res.json();

    return json[2].firstName;
  };
|})(endpoint);
const result = await f();

With the layering suggested here, this works with external modules as well:

const f = () => new WorkerFunction('f.js')(endpoint);
const result = await f();

Use with dynamic import()

There may be cases where you want to use a module in a worker or within the same thread as the host module, depending on some condition. import() can be extended to accept inline module definitions:

const mod = await import({|
  import * as foo from 'foo';
  export let x = 1;
|});

The returned module is not asynchronous like a ModuleWorker's, but a small wrapper can provide a similar async interface:

const definition = {|
  export const square = (x) => x**2;
|};
const mod = (useWorker)
    ? new ModuleWorker(definition)
    : new LocalAsyncModule(definition);
for (let x of data) {
  console.log(await mod.call('square', x));
}
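The LocalAsyncModule wrapper used above is not defined anywhere; a minimal same-thread sketch, assuming it only needs a .module property and a call() method to match ModuleWorker's surface, might look like:

```javascript
// Minimal same-thread counterpart to ModuleWorker: same async surface,
// no worker. The name and shape are assumptions from the example above.
class LocalAsyncModule {
  constructor(moduleNamespace) {
    // Mirror ModuleWorker#module: a Promise of the module object.
    this.module = Promise.resolve(moduleNamespace);
  }
  async call(functionName, ...args) {
    const mod = await this.module;
    return mod[functionName](...args);
  }
}

// A plain object stands in for an instantiated inline module.
const mod = new LocalAsyncModule({square: (x) => x ** 2});
mod.call('square', 3).then((r) => console.log(r)); // logs 9
```

Because both wrappers expose the same `call()` interface, the worker/local decision stays a one-line ternary in the caller.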

Handling Transferrables and Clones

The postMessage API requires a separation of structured clone data and transferrable objects. This makes it somewhat tricky to map onto a single argument list of a function if we want to support a mixed list of clonable and transferrable arguments.

ModuleWorker#call could try to automatically map arguments into the message data or transfer list by introspecting the arguments:

class ModuleWorker {
  call(functionName, ...args) {
    // Transferrable arguments still appear in the message data, but must
    // also be listed in postMessage's transfer list.
    const transferArgs = [];
    for (const arg of args) {
      if (arg instanceof ArrayBuffer ||
          arg instanceof MessagePort ||
          arg instanceof ImageBitmap) {
        transferArgs.push(arg);
      }
    }
    this.postMessage({functionName, args}, transferArgs);
  }
}

This might be tricky for types that are both clonable and transferrable, like ArrayBuffer, but there could be some signal from the caller about which method to use, like:

const f = (buffer) => mod.call('f', mod.transfer(buffer));
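One way to implement that signal is a marker object that call() unwraps. The TRANSFER symbol and splitArgs helper below are hypothetical, sketching how marked arguments could be routed to the transfer list:

```javascript
// Hypothetical transfer marker: mod.transfer(v) would wrap v like this.
const TRANSFER = Symbol('transfer');
const transfer = (value) => ({[TRANSFER]: true, value});

// call() would split its arguments: marked values go in both the message
// data and the transfer list; everything else is left to structured clone.
const splitArgs = (args) => {
  const data = [];
  const transferList = [];
  for (const arg of args) {
    if (arg != null && arg[TRANSFER]) {
      data.push(arg.value);
      transferList.push(arg.value);
    } else {
      data.push(arg);
    }
  }
  return {data, transferList};
};

const buffer = new ArrayBuffer(8);
const {data, transferList} = splitArgs([transfer(buffer), 42]);
console.log(transferList.length); // 1
console.log(data.length);         // 2
```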

Async module initialization

Async inline modules are an idea to allow modules to do async initialization work that doesn't block the async initialization work of other modules in the graph. This would allow all async initialization in a module graph to start and make progress concurrently.

The idea is described in this top-level await issue, but it can be updated to work with more general inline module definitions:

a.js:

import template from async {|
  import fetchHTML from 'fetch-html';
  export default (await fetchHTML('a.html')).template;
|};
import './b.js';

b.js:

import template from async {|
  import fetchHTML from 'fetch-html';
  export default (await fetchHTML('b.html')).template;
|};

The browser can start executing the async inline modules in a.js and b.js as soon as they are ready, in any order.

Doing similar initialization with top-level await would mean that the fetch in a.js can't start until the fetch in b.js has completed:

a.js:

import fetchHTML from 'fetch-html';
import './b.js';
const template = (await fetchHTML('a.html')).template;

b.js:

import fetchHTML from 'fetch-html';
const template = (await fetchHTML('b.html')).template;
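The ordering difference can be sketched with plain async functions: an async function body runs synchronously up to its first await, so starting both initializers before awaiting either (the async-inline-module model) puts both "fetches" in flight at once. `fakeFetch` below is a stand-in for fetchHTML:

```javascript
// Record the interleaving of two async initializers.
const order = [];
const fakeFetch = async (name) => {
  order.push(`start ${name}`);  // runs synchronously at call time
  await Promise.resolve();      // simulate the network round-trip
  order.push(`end ${name}`);
  return {template: `<template for ${name}>`};
};

// Concurrent model: both initializers start before either completes.
const done = Promise.all([fakeFetch('a.html'), fakeFetch('b.html')]);
console.log(order); // ['start a.html', 'start b.html']

// The sequential top-level-await model would instead be:
//   await fakeFetch('b.html'); await fakeFetch('a.html');
// where 'start a.html' only appears after 'end b.html'.
```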

Sync module initialization

Because imported modules are evaluated before a module's own statements, it's a bit tricky to do initialization whose results are needed by an import.

Whereas in CommonJS you can do this:

const settings = require('x-settings');
settings.enableThing();
const library = require('x-library');

And the x-library module can use the settings object. The same code in standard JS modules will not work as-is; you need to split the settings out into a separate module:

my-settings.js:

import * as settings from 'x-settings';
settings.enableThing();

my-app.js:

import 'my-settings';
import * as library from 'x-library';

This is a bit cumbersome. Inline modules can help:

my-app.js:

import {|
  import * as settings from 'x-settings';
  settings.enableThing();
|};
import * as library from 'x-library';