@shadowmoose
Last active October 24, 2023
TypeScript: Encrypt and decrypt files as streams, appending to and resuming from arbitrary locations, efficiently.
import crypto from 'crypto';
import fs, {createReadStream, createWriteStream} from 'fs';
import * as stream from 'stream';
import {pipeline} from 'stream/promises';
/*
File encryption toolkit, for streaming read/write and appending.

Designed primarily for writing/reading file ranges, as they may be requested via HTTP,
though this works for any use case where encrypted file writing/reading is desired.

The resulting files are always compatible with standard aes-256-cbc decryption,
so this library is not required in order to read them.

All file access is handled efficiently, never reading more than 32 bytes (blockSize*2)
in order to resume writing or reading.
*/
/**
 * Open a read stream over an encrypted file, returning decrypted bytes.
 * Accepts an optional start position and limit, which allows the encrypted file to be
 * efficiently read from the given point without requiring a full file read.
 * If `length` is provided, the stream will only return up to the given number of bytes.
 */
export function decryptedStream(file: string, key: string, iv: string, start: number = 0, length?: number) {
    const cKey = Buffer.from(key, 'base64');
    const cIv = Buffer.from(iv, 'base64');
    const cipherOffset = calcEncBlockOffset(start);
    // When resuming mid-file, the first decrypted block is junk (wrong IV) and is discarded below.
    const decStream = crypto.createDecipheriv('aes-256-cbc', cKey, cIv);
    const fileStream = createReadStream(file, {start: cipherOffset.startAt});
    let skip = cipherOffset.discard;
    let limit = length || 0;
    const limiter = new stream.Transform({
        transform(chunk, encoding, next) {
            const send = (buff: Buffer) => {
                if (length) {
                    if (!limit) {
                        // The requested length has been fully emitted; stop early.
                        this.end();
                        return next(null);
                    }
                    buff = buff.slice(0, Math.min(limit, buff.length));
                    limit -= buff.length;
                }
                next(null, buff);
            };
            if (chunk.length < skip) {
                // This entire chunk is junk from before the requested start point.
                skip -= chunk.length;
                return next(null);
            }
            if (skip) {
                // Discard the remaining junk bytes, then emit the rest of this chunk.
                chunk = chunk.slice(skip);
                skip = 0;
                return send(chunk);
            }
            send(chunk);
        }
    });
    return {
        stream: limiter,
        promise: pipeline(fileStream, decStream, limiter)
    };
}
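// Illustrative usage (not part of the original gist): serve a decrypted byte range,
// as an HTTP Range handler might. The file path and offsets here are assumptions.
async function exampleReadRange(key: string, iv: string) {
    // Read 1024 decrypted bytes starting at plaintext offset 5000.
    const {stream: decrypted, promise} = decryptedStream('/tmp/data.enc', key, iv, 5000, 1024);
    decrypted.pipe(process.stdout);
    await promise; // Wait for the read/decrypt pipeline to settle.
}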
/**
 * Creates a stream that appends to an encrypted file.
 * If the file does not already exist, a new one is created.
 * Resuming appending is done efficiently, only reading minimal data from the file.
 */
export async function encryptedAppendStream(file: string, keyStr?: string|null, ivStr?: string|null) {
    const key = keyStr ? Buffer.from(keyStr, 'base64') : crypto.randomBytes(32);
    const iv = ivStr ? Buffer.from(ivStr, 'base64') : crypto.randomBytes(16);
    let preexistingBytes = 0;
    const existing = await peekLastFullIv(file, iv);
    if (!existing) {
        // Create an empty file so the 'r+' write stream below can open it.
        await fs.promises.writeFile(file, '');
    }
    const encStream = crypto.createCipheriv('aes-256-cbc', key, existing?.lastIv || iv);
    const ws = createWriteStream(file, { flags: 'r+', start: existing?.writeFrom || 0 });
    const finishPromise = new Promise<void>(r => ws.on('finish', r));
    encStream.pipe(ws, {end: true});
    if (existing) {
        // Decrypt the trailing partial block, then re-encrypt it as the start of the new data.
        const decStream = crypto.createDecipheriv('aes-256-cbc', key, existing.lastIv);
        const raw = Buffer.concat([decStream.update(existing.extraBytes), decStream.final()]);
        encStream.write(raw);
        preexistingBytes = existing.writeFrom + raw.length;
    }
    return {
        stream: encStream,
        key: key.toString('base64'),
        iv: iv.toString('base64'),
        preexistingBytes,
        close: (chunk?: any) => { encStream.end(chunk); return finishPromise; }
    };
}
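// Illustrative usage (not part of the original gist): create a file, append, then
// resume appending later with the same key/iv. The path and payloads are assumptions.
async function exampleAppend() {
    const first = await encryptedAppendStream('/tmp/data.enc');
    first.stream.write('hello ');
    await first.close('world');
    // Reopen with the returned key/iv; new bytes are appended after the existing plaintext.
    const resumed = await encryptedAppendStream('/tmp/data.enc', first.key, first.iv);
    await resumed.close(' - and more');
}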
/**
 * Using a known cipher block size, calculate the position the cipher needs to read from in order to succeed.
 *
 * AES-CBC can pick up reading encrypted files at arbitrary starting points,
 * provided that it reads the previous whole block first.
 *
 * However, the bytes decrypted from that previous block will be junk (and unwanted anyway),
 * so this function also indicates how many "junk" bytes from the "startAt" location need to be discarded
 * in order to reach the real starting location of the desired data.
 */
function calcEncBlockOffset(startByte: number, blockSize = 16) {
    const expectedLen = Math.floor(startByte / blockSize) * blockSize;
    const startAt = Math.max(0, expectedLen - blockSize);
    return {
        startAt,
        discard: startByte - startAt
    };
}
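// Worked example (illustrative, not part of the original gist):
// calcEncBlockOffset(100) -> { startAt: 80, discard: 20 }
// Byte 100 lives in the block starting at 96, so the read begins one block earlier (80);
// the 16 junk bytes of that block plus the 4 bytes preceding byte 100 are discarded.
// For startByte < 32, startAt clamps to 0 (the real IV applies) and only startByte bytes are discarded.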
/**
 * Returns the position of the last known full block, which will be blockSize*2 back from the
 * block-aligned end of the file, as the final block may be incomplete.
 */
async function calcLastFullIvPosition(file: string, blockSize = 16) {
    const realSize = (await fs.promises.stat(file)).size;
    return Math.floor(realSize / blockSize) * blockSize - (blockSize * 2);
}
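// Worked example (illustrative): for a 100-byte file, floor(100/16)*16 - 32 = 64, so the
// intact IV block occupies bytes [64, 80) and bytes 80 onward are the trailing (possibly
// partial) data. Files shorter than two blocks return a negative position, which
// peekLastFullIv below treats as "start over from byte 0 with the original IV".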
/**
 * Fetch the last IV that is completely intact, and return it.
 *
 * Also returns any lingering bytes written after the complete IV,
 * which indicate a partial block that should be decoded and re-written before any new data.
 * This partial data comes from a block that may be incomplete, and thus should never be longer than the blockSize.
 *
 * Finally, this function also returns the position that a writer should begin from.
 * The writer should decode the given `extraBytes`, write them, and then continue with writing any new data to be appended.
 */
async function peekLastFullIv(file: string, firstIv: Buffer, blockSize = 16) {
    try {
        let useBase = false;
        let start = await calcLastFullIvPosition(file, blockSize);
        if (start < 0) {
            // The file is too short to contain a full IV block; fall back to the original IV.
            start = 0;
            useBase = true;
        }
        // Read everything from `start` to the end of the file. Collecting chunks directly
        // (rather than inside a Promise executor) lets read errors reach the catch below.
        const chunks: Buffer[] = [];
        for await (const chunk of fs.createReadStream(file, {start})) {
            chunks.push(chunk as Buffer);
        }
        const buff = Buffer.concat(chunks);
        if (useBase) {
            return {
                writeFrom: 0,
                lastIv: firstIv,
                extraBytes: buff
            };
        }
        return {
            writeFrom: start + blockSize,
            lastIv: buff.slice(0, blockSize),
            extraBytes: buff.slice(blockSize)
        };
    } catch (err) {
        // Most likely the file does not exist yet.
        return null;
    }
}
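// End-to-end sketch (added for illustration, not part of the original gist): append in two
// sessions, then read the full plaintext back. The file path is an assumption.
async function exampleRoundTrip() {
    const a = await encryptedAppendStream('/tmp/log.enc');
    await a.close('first chunk');
    const b = await encryptedAppendStream('/tmp/log.enc', a.key, a.iv);
    await b.close(', second chunk');
    // Decrypt from the start; omitting `length` streams the whole file.
    const {stream: plain, promise} = decryptedStream('/tmp/log.enc', a.key, a.iv);
    plain.pipe(process.stdout); // -> "first chunk, second chunk"
    await promise;
}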