@ivan-marquez · Created March 10, 2018 16:04
Exploring Node.js streams
const fs = require('fs');

const server = require('http').createServer();

server.on('request', (req, res) => {
  /**
   * Buffering approach: this reads the whole big.file (~400 MB) into memory
   * before writing it out to the response object, which is very inefficient.
   * Memory used: ~430 MB.
   */
  // fs.readFile('./big.file', (err, data) => {
  //   if (err) throw err;
  //   res.end(data);
  // });

  /**
   * Streaming approach: the HTTP response object (res) is also a writable stream,
   * so if we have a readable stream representing the content of big.file, we can
   * pipe one into the other and achieve mostly the same result without consuming
   * ~400 MB of memory. Node's fs module can give us a readable stream for any
   * file via the createReadStream method, and we pipe that to the response object.
   * Memory used: ~40 MB.
   */
  const src = fs.createReadStream('./big.file');
  src.pipe(res);
});

server.listen(8000);
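
The comments above assume a roughly 400 MB file named big.file sitting next to the script. The gist doesn't include a generator for it, but a minimal sketch along these lines will produce one; the helper file name, the dummy text, and the line count are arbitrary choices for illustration, not part of the original gist:

// create-big-file.js: hypothetical one-off helper, not part of the original gist
const fs = require('fs');

const file = fs.createWriteStream('./big.file');

// Each dummy line is ~64 bytes, so ~6 million lines lands near 400 MB.
// Backpressure from write() is ignored for brevity, which is fine for a
// throwaway script but means the generator itself buffers heavily in memory.
for (let i = 0; i < 6e6; i++) {
  file.write('lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do\n');
}

file.end();

With big.file in place, start the server, request the file with something like curl localhost:8000 > /dev/null, and compare the process memory for the commented-out readFile version against the streaming version.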