NodeJS File parsing

Here's a skeleton for ripping files apart in NodeJS and processing each line.

var fs = require('fs');
var zlib = require('zlib');
var stream = require('stream');
var es = require('event-stream');

var return_object = {
  logs: []
};

// Path to the (possibly gzipped) log file, e.g. taken from the command line.
var filename = process.argv[2];

// Create a readStream object...
fs.createReadStream(filename)
  // Un-gzip the stream on the fly (drop this stage for plain-text files).
  .pipe(zlib.createGunzip())
  // Split on \n
  .pipe(es.split())
  // Now, do something with each line.
  .pipe(es.map(function(data, callback){
      // In my case I have a line parse function elsewhere in the script.
      my.parse(data, function(error, object){
        if ( error ) {
          console.log(error);
          // Drop this line, but keep the stream moving.
          return callback();
        }
        // And I have a filter function as well to rate logs.
        my.filter(object, 70, function(error, log){
          if ( error ) {
            console.log(error);
            console.log(data);
          } else {
            return_object.logs.push(object);
          }
          // es.map needs its callback invoked for every line,
          // otherwise the stream never signals that it is done.
          callback(null, object);
        });
      });
  }))
  // Do something when the whole pipeline has finished.
  // (Listening for 'end' on the raw readStream would fire before the
  // lines had actually been parsed and filtered.)
  .on('end', function(){
    console.log(return_object);
  });
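
The snippet leans on my.parse and my.filter, which live elsewhere in the original script and aren't shown. Purely as an illustration, here is a minimal sketch of what they could look like, assuming one JSON document per line and a numeric severity field scored against the 70 threshold; none of those details come from the gist itself.

// Hypothetical helpers, sketched only to make the skeleton self-contained.
// The JSON-lines format and the numeric 'severity' field are assumptions.
var my = {
  // Parse one raw line into an object (here: one JSON document per line).
  parse: function(line, callback) {
    if (!line) { return callback(new Error('empty line')); }
    try {
      callback(null, JSON.parse(line));
    } catch (err) {
      callback(err);
    }
  },

  // "Rate" a log: pass it through only when its severity meets the threshold,
  // otherwise report why it was dropped.
  filter: function(object, threshold, callback) {
    if (typeof object.severity !== 'number') {
      return callback(new Error('log line has no numeric severity field'));
    }
    if (object.severity < threshold) {
      return callback(new Error('severity ' + object.severity + ' is below ' + threshold));
    }
    callback(null, object);
  }
};

With helpers shaped like these in scope, running the script against a gzipped JSON-lines log leaves only the lines rated at or above 70 in return_object.logs.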
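
One more note: the pipeline above always pushes the data through zlib.createGunzip(), which will error on a plain-text file. If you need to handle both, one option (assuming you can key off the .gz extension) is to build the source stream conditionally:

var path = require('path');

// Return a stream of decompressed bytes, only adding the gunzip stage
// when the file name ends in .gz.
function openLogStream(filename) {
  var source = fs.createReadStream(filename);
  if (path.extname(filename) === '.gz') {
    return source.pipe(zlib.createGunzip());
  }
  return source;
}

// The rest of the pipeline then starts from the returned stream:
// openLogStream(filename).pipe(es.split()).pipe(es.map(...)).on('end', ...);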