@techlab23
Last active January 25, 2024 06:16
Memory-efficient large CSV processing
import { createReadStream } from 'node:fs';
import { parse } from 'csv-parse';

// Each parsed record is keyed by the column names passed to the parser below.
interface EquipRecord {
  VehicleKey: string;
  EquipDescription: string;
}

async function efficientReadAndParse() {
  // Pipe the file stream into the CSV parser; stream backpressure keeps only a
  // small window of the file in memory at any time.
  const parser = createReadStream('VEStdEquip.csv', 'utf-8').pipe(
    parse({ delimiter: ',', columns: ['VehicleKey', 'EquipDescription'] })
  );

  // for await pulls one record at a time and pauses the stream while we work on it.
  for await (const record of parser as AsyncIterable<EquipRecord>) {
    await processData(record);
  }
  console.log('done');
}

function delay(ms: number) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function processData(record: EquipRecord) {
  // Simulate a slow per-record operation (e.g. an API call or database write).
  await delay(1000);
  console.log(record);
}

efficientReadAndParse();
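
If per-record latency dominates, one possible variant is to process records in small, bounded batches with Promise.all so several records are in flight at once while backpressure still caps memory use. This is a sketch, not part of the original gist: BATCH_SIZE is an assumed tuning value, and it reuses the imports, EquipRecord interface, and processData function from the code above.

// Sketch: bounded concurrency via batching (assumes the imports, EquipRecord,
// and processData defined above; BATCH_SIZE is a hypothetical tuning value).
const BATCH_SIZE = 100;

async function batchedReadAndParse() {
  const parser = createReadStream('VEStdEquip.csv', 'utf-8').pipe(
    parse({ delimiter: ',', columns: ['VehicleKey', 'EquipDescription'] })
  );

  let batch: EquipRecord[] = [];
  for await (const record of parser as AsyncIterable<EquipRecord>) {
    batch.push(record);
    if (batch.length >= BATCH_SIZE) {
      // Wait for the whole batch before pulling more records, so at most
      // BATCH_SIZE records are buffered at any time.
      await Promise.all(batch.map((r) => processData(r)));
      batch = [];
    }
  }
  // Flush any remaining records from the final partial batch.
  if (batch.length > 0) {
    await Promise.all(batch.map((r) => processData(r)));
  }
  console.log('done');
}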