
Let's encrypt some files with Node! - Part 1

What we'll be learning

This tutorial is targeted at people who are very familiar with JavaScript and at least somewhat familiar with Node.js.

We'll be learning:

  • How to work with Node streams.
  • How to write custom streams.
  • How to use the crypto module.
  • A little bit about AES encryption.

What we'll be building

We're going to build a CLI program that will allow us to compress and encrypt a file using a password, and then decrypt and decompress that file using that same password. We'll be doing it entirely in Node with no external dependencies.

Overall, the plan (see the sketch after these lists) is to:

  1. Read some plaintext.
  2. Compress it.
  3. Encrypt it.
  4. Append data used in the encryption process (which is needed for decryption later).
  5. Write the cipher text to a file.

Then, we'll need to reverse those steps:

  1. Read some cipher text.
  2. Pull the encryption data.
  3. Decrypt it.
  4. Decompress it.
  5. Write the plaintext to a file.
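
In stream terms (which we'll dig into below), the first half of that plan will eventually look something like this sketch. Here, cipher and appendInitVect are placeholders for the crypto stream and the custom stream we'll build in part 2, and file.txt.enc is just an illustrative name:

// Rough shape of the encryption pipeline (sketch only):
fs.createReadStream('./file.txt')                // 1. read plaintext
  .pipe(zlib.createGzip())                       // 2. compress
  .pipe(cipher)                                  // 3. encrypt (part 2)
  .pipe(appendInitVect)                          // 4. append encryption data (part 2)
  .pipe(fs.createWriteStream('./file.txt.enc')); // 5. write the cipher text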

Sound good? Let’s get started.

If you just want to see the source code, it's on GitHub here.

Step 0: Preparing our project

First, let’s create a directory, and, in it, create two files, index.js and file.txt. Our directory should look like this:

.
├── index.js
└── file.txt

In file.txt, let’s put a little bit of text (I used a paragraph from baconipsum):

Spicy jalapeno bacon ipsum dolor amet fugiat fatback ut flank dolor in ea, aute buffalo duis. T-bone occaecat sunt nisi commodo pig. Beef ullamco prosciutto irure cow dolore. Reprehenderit chicken ut, pork chop venison consectetur quis in. Ut pig duis aliqua.

Step 1: Reading Files

Like other JS runtimes, Node runs your code in a single thread. However, thanks to the event loop, all I/O can run in parallel, and we can take advantage of that by avoiding the synchronous APIs and relying on the asynchronous ones.

For example, if you want to read a file, you can do it synchronously like this:

const fs = require('fs');

// Without an encoding argument, readFileSync returns a Buffer, so ask for utf8.
const fileContents = fs.readFileSync('./file.txt', 'utf8');

console.log(fileContents);

That totally works, but it blocks the thread until the entire file has been read, which is bad. The other way to do it is with a read file stream:

const fs = require('fs');

const readStream = fs.createReadStream('./file.txt');

readStream.on('data', (chunk) => {
  console.log(chunk.toString('utf8'));
});

What this is doing is pretty cool. createReadStream asynchronously reads the file chunk by chunk without blocking the rest of the code execution. It's still a bit clunky, though, and we can clean it up with a cool feature of streams: piping.

const fs = require('fs');

const readStream = fs.createReadStream('./file.txt');

readStream.pipe(process.stdout);

In Node (unless you change it), console.log writes to process.stdout, and because process.stdout is a writable stream, we can pipe the read stream into it and have it print each chunk of data as it receives it from the read stream.
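
Under the hood (ignoring backpressure and errors for a moment), that pipe call is roughly equivalent to wiring up the 'data' event ourselves:

// Roughly what readStream.pipe(process.stdout) does, minus backpressure handling:
readStream.on('data', (chunk) => process.stdout.write(chunk));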

Neat, now what?

Step 2: Writing Files

Let's expand this code to create a new file. For that, we need a new method: createWriteStream.

const fs = require('fs');

const readStream = fs.createReadStream('./file.txt');
const writeStream = fs.createWriteStream('./newfile.txt');

readStream.on('data', (chunk) => {
  writeStream.write(chunk);
});

readStream.on('close', () => {
  // Be sure to close the write stream!
  writeStream.end();
});

Here, we're calling the write method on the write stream with each chunk of data we read from the read stream. Finally, we're ending the write stream once we're done reading all the data from the read stream.
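
One thing worth adding in real code, as a quick sketch on top of the above: 'error' listeners on both streams. Streams emit 'error' events rather than throwing, and an unhandled 'error' (say, a missing file) will crash the process.

// Worth having in real code: handle errors on both ends of the copy.
readStream.on('error', (err) => console.error('Read failed:', err));
writeStream.on('error', (err) => console.error('Write failed:', err));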

But again, this is kind of clunky. Rather than invoking the write method, let's do what we did before and pipe the read stream directly to the write stream!

const fs = require('fs');

const readStream = fs.createReadStream('./file.txt');
const writeStream = fs.createWriteStream('./newfile.txt');

readStream.pipe(writeStream);

Much better.

Piping, besides being more terse, handles both writing to the stream and closing, or 'end'ing, it.
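
It also handles backpressure for us: write() returns false when the destination's internal buffer is full, and a 'drain' event fires once it empties. Hand-rolled, that logic would look roughly like this simplified sketch:

// A simplified sketch of what pipe() manages for us (error handling omitted):
readStream.on('data', (chunk) => {
  if (!writeStream.write(chunk)) {
    readStream.pause(); // destination buffer is full; stop reading
    writeStream.once('drain', () => readStream.resume()); // safe to continue
  }
});
readStream.on('end', () => writeStream.end());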

So now we have a pretty useless program which creates a new file with the exact same data as some other file, but we're headed in the right direction.

Step 3: Compression

Rather than simply writing the same contents to a new file, let's compress the data as we write it. For that, we'll need another built-in Node module: zlib. The zlib module offers a few compression and decompression schemes, but the one we're going to use is gzip, which you might be familiar with.

To create a gzip stream, we just need to require the zlib module and call createGzip:

const fs = require('fs');
const zlib = require('zlib');

const readStream = fs.createReadStream('./file.txt');
const gzipStream = zlib.createGzip();
const writeStream = fs.createWriteStream('./newfile.txt');

readStream
  .pipe(gzipStream)
  .pipe(writeStream);

That's it!
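
(And to reverse it later, zlib provides a matching createGunzip() stream. A quick sketch, decompressing back to stdout:)

// Decompression is the same pipeline with a gunzip stream instead:
fs.createReadStream('./newfile.txt')
  .pipe(zlib.createGunzip())
  .pipe(process.stdout);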

Let's check out newfile.txt:

1f8b 0800 0000 0000 0013 b590 d151 0331
0c44 ffa9 620b 0857 0425 1028 40a7 d325
e26c cbb1 2518 ba47 09b4 c097 343b 3bda
b73a 77e5 6f7c 50a1 2ecd b012 5b83 f619
159b 151b a02a 8e3d 2e4a 39c8 d370 2072
2dd4 8e3f 8b36 089d 40e1 8235 f69d 8a61
0b9d 0bde 9e57 6b02 6326 e13c 30a3 399a
4e05 5bad b619 ba5e 16bc 88ec 8852 a872
2ac3 266b b81b 74c4 90b4 7efd 06c9 8257
e943 aed2 3619 eae0 abf2 212d 794e e836
8e14 ace3 5332 215b 6493 29ec e231 704b
9ce4 5cf0 eef7 c807 1ea8 e82d 68c1 f93f
7ff0 f403 304e c86c 6201 0000 

Cool, binary data. Not super readable, but it is much smaller: 354 bytes down to 204, about 42% smaller!

354 file.txt
204 newfile.txt
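
As an aside, createGzip also accepts options if we wanted to trade CPU time for size. A sketch (the default level is usually a fine trade-off):

// Maximum compression via the level option (0-9):
const gzipStream = zlib.createGzip({ level: zlib.constants.Z_BEST_COMPRESSION });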

Time to encrypt this bad boy...in part 2.
