@ryzokuken
Created May 23, 2018 05:11
A small gist that attempts to reproduce nodejs/node#20824

This is a small example that attempts to reproduce the error reported in nodejs/node#20824, possibly due to a race condition.

It seems like there are race conditions between the StreamPipe mechanism and HTTP/2 shutting down; on the HTTP/2 side, it might be as easy as turning the failing !this->IsDestroyed() check into an early return.

On the FileHandle side, it would be good to get a proper JS stack trace for the site of the assertion.

For a full-fledged example closer to a real-world application, visit https://github.com/budarin/http2-test-project

Includes:

  • A simple HTTP/2 server that causes such a failure (server.js)
  • A sample client, others should work (client.js)
  • A sample file, any file should work (hello.txt)
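Note on setup (an assumption based on the file names referenced in the code, not spelled out in the gist): both server.js and client.js expect a self-signed certificate for localhost in the working directory (localhost-privkey.pem and localhost-cert.pem). One way to generate these is the one-liner from the Node.js HTTP/2 docs: openssl req -x509 -newkey rsa:2048 -nodes -sha256 -subj '/CN=localhost' -keyout localhost-privkey.pem -out localhost-cert.pem. Start server.js first, then run client.js against it.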

Rationale:

Because it's a race condition, it is very hard to reproduce reliably with a small number of respondWithFile(...) calls. As @budarin points out, uncommenting the following lines from their code makes it easier to get a crash within a reasonable number of reloads:

"but if you push something that is not used on the page, the next page refresh will close the http2 stream"

However, that is not the actual reason for the crash; increasing the number of files being sent simply increases the chance of the race condition failing on us.

In the code above, the loop runs 100 times, pushing the file that many times onto the stream. 100 was chosen because it is big enough to cause a failure within a reasonable number of client runs (1-5), yet not so big that the server is too likely to fail during a single client run; in other words, the Goldilocks value we needed in this case.
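Since only a handful of client runs are usually needed, the retries can be automated. The following driver is a hypothetical helper, not part of the gist: it assumes server.js is already listening on port 1337, and that a crashed server makes client.js exit non-zero (via the request's unhandled 'error' event).

// crash-driver.js (hypothetical helper, not part of this gist)
const { spawnSync } = require('child_process');
const path = require('path');

for (let i = 1; i <= 20; i++) {
  console.log(`client run ${i}`);
  // Run client.js to completion; inherit stdio so its output stays visible.
  const { status } = spawnSync(
    process.execPath,
    [path.join(__dirname, 'client.js')],
    { stdio: 'inherit' }
  );
  if (status !== 0) {
    console.log('client run failed; the server has most likely crashed');
    break;
  }
}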

client.js:

const http2 = require('http2');
const fs = require('fs');

// Connect to the local server, trusting its self-signed certificate.
const client = http2.connect('https://localhost:1337', {
  ca: fs.readFileSync('localhost-cert.pem')
});
client.on('error', err => console.error(err));

const req = client.request({ ':path': '/' });

req.on('response', (headers, flags) => {
  for (const name in headers) {
    console.log(`${name}: ${headers[name]}`);
  }
});

// Collect the response body and close the session when it ends.
req.setEncoding('utf8');
let data = '';
req.on('data', chunk => {
  data += chunk;
});
req.on('end', () => {
  console.log(`\n${data}`);
  client.close();
});
req.end();
hello.txt:

hello world
server.js:

const http2 = require('http2');
const fs = require('fs');

const server = http2.createSecureServer({
  key: fs.readFileSync('localhost-privkey.pem'),
  cert: fs.readFileSync('localhost-cert.pem')
});

// Open a push stream on the given parent stream and respond with hello.txt.
function pushStream(stream, data) {
  stream.pushStream(
    { ':path': '/' },
    { parent: stream.id },
    (err, pushStream, headers) => {
      if (err) throw err;
      pushStream.respondWithFile(`${__dirname}/hello.txt`, {
        'content-type': 'text/plain'
      });
    }
  );
}

server.on('stream', stream => {
  stream.respond({ ':status': 200 });
  // Push the file 100 times to make the race condition likely to fire.
  for (let i = 0; i < 100; i++) {
    pushStream(stream, i);
  }
  stream.end();
});

server.listen(1337);
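As an aside, the repro deliberately throws on any pushStream error so that problems surface immediately. A production server would guard its pushes instead, for example by checking the stream.pushAllowed property before pushing. A sketch of such a variant, not part of the repro:

// A defensive variant of pushStream (sketch only, not part of the repro).
function guardedPushStream(stream) {
  // Skip pushing entirely if the peer has disabled push via SETTINGS.
  if (!stream.pushAllowed) return;
  stream.pushStream({ ':path': '/' }, { parent: stream.id }, (err, pushStream) => {
    if (err) {
      // Log instead of crashing the whole server on a refused push.
      console.error('push failed:', err);
      return;
    }
    pushStream.respondWithFile(`${__dirname}/hello.txt`, {
      'content-type': 'text/plain'
    });
  });
}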
@budarin

budarin commented May 23, 2018

Hi.
I would add a few iterations of the call on the client.

I think what happens is this: Cmd+R in the browser breaks the current connection and creates a new one.
I ran into the problem when the page was refreshed rapidly several times.
