
@trevnorris
Last active August 29, 2015 14:17

In the following example, data is read from a file, undergoes a transform, and is then written to a port. How are the following situations handled?

  1. If attempting to write to the port results in an error, how do we let the file know it's time to close?

  2. If the file is deleted out from under us, how do we let the port know to close?

  3. If the transform hits a parsing error, how does it alert both the file and the port that they need to close and clean up?

readableFile.pipeThrough(ts).pipeTo(writablePort);
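
For reference, here is roughly where the cleanup hooks would have to live for that pipeline: a cancel() hook on the file source and an abort() hook on the port sink, with pipeThrough()/pipeTo() responsible for invoking them when another piece fails. This is a sketch only; the exact callback signatures are assumed from the draft, and the file/port internals are placeholders.

var readableFile = new ReadableStream({
  start: function(enqueue, close, error) {
    // start reading the file and pass each chunk to enqueue()
  },
  cancel: function(reason) {
    // the consumer no longer wants data; close the file descriptor here
  }
});

var writablePort = new WritableStream({
  write: function(chunk) {
    // write the chunk out to the port
  },
  abort: function(reason) {
    // something upstream failed; close the port here
  }
});

The questions above come down to whether pipeTo() guarantees that a failure in any one of the three pieces is routed into these hooks on the other two.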

Injecting yourself into the streaming logic is simple with basic event emitters: override the existing callback with a new one and pass the data down the chain. How is this possible with Streams?

// "normal" setup
server.onconnection(function(c) {
  c.ondata(onData);
});

// override script: ondata() with no argument returns the currently
// installed handler, so it can be wrapped
var oldOnData = c.ondata();
c.ondata(function(chunk) {
  // do stuff with the chunk first (e.g. decrypt cookies)

  oldOnData.call(this, chunk);
});

This is useful for the real-world case where someone wants to intercept incoming data from the client in order to decrypt cookies before the data reaches the http parser.

Say, for example, the pipeline has already been set up like so:

connectionData.pipeThrough(headerParser).pipeTo(receiver);

But I want to inject another transform stream into the pipe. How can I do that?
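
The only route I can see is to rebuild the pipeline with the extra stage in the middle, which assumes the original pipe either hasn't been set up yet or can be torn down first. Here cookieDecryptor stands in for a hypothetical TransformStream:

connectionData
  .pipeThrough(cookieDecryptor)  // injected stage
  .pipeThrough(headerParser)
  .pipeTo(receiver);

That works at setup time, but unlike the ondata override it gives no obvious way to splice into a pipe that is already flowing.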


I worry about the performance of these things. Promises are still very slow in V8, and I don't know of anything that will make them faster in the near future. I prefer to code for what is fast today, not what may be fast tomorrow, and Streams (the interface for peripheral access) is too important to be slow while we wait for VMs to improve.

Gist with performance examples: https://gist.github.com/trevnorris/4a8b6dd856cf3e1b4268


Along the lines of performance, it's impossible with the current standard, or even with Promises in general, to completely flatten function declarations. Say, for example, you want to determine which pipeline to take in the start callback.

let rs = new ReadableStream({
  start: function(enqueue, close, error) {
    // Have to do this asynchronously because "rs" isn't available yet.
    process.nextTick(function() {
      if (conditional)
        rs.pipeTo(ws1);
      else
        rs.pipeTo(ws2);
    });
  }
});

Now the above could be un-nested if nextTick() accepted arguments, but the call would still have to be asynchronous because of variable scope: rs isn't assigned until the constructor returns.
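
For illustration, this is about as flat as it gets, assuming rs, ws1, ws2, and the conditional all live in the enclosing scope: the callback can be hoisted to a named function, but the call itself still has to be deferred.

// Hoisted out of start(), but it must still run via nextTick() because
// `rs` isn't assigned until after the constructor returns.
function pickPipeline() {
  if (conditional)
    rs.pipeTo(ws1);
  else
    rs.pipeTo(ws2);
}

let rs = new ReadableStream({
  start: function(enqueue, close, error) {
    process.nextTick(pickPipeline);
  }
});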


It doesn't sit well with me that the Promise spec has to be extended to support all the needs of streams (e.g. CancellablePromise()). To me that plainly shows the wrong tool is being used for the job. Promises perform well at what they were designed for, but the need to continually expand their functionality should raise a red flag.
