@kristopherjohnson
Last active May 14, 2024 23:13
Reads JSON from standard input and writes formatted JSON to standard output. Requires Node.js.
#!/usr/bin/env node

// Reads JSON from stdin and writes equivalent
// nicely-formatted JSON to stdout.

var stdin = process.stdin,
    stdout = process.stdout,
    inputChunks = [];

stdin.resume();
stdin.setEncoding('utf8');

stdin.on('data', function (chunk) {
    inputChunks.push(chunk);
});

stdin.on('end', function () {
    // Join with an empty separator; join() with no argument would insert commas.
    var inputJSON = inputChunks.join(''),
        parsedData = JSON.parse(inputJSON),
        outputJSON = JSON.stringify(parsedData, null, ' ');
    stdout.write(outputJSON);
    stdout.write('\n');
});

@mimno

mimno commented Jun 11, 2014

Array.join() with no arguments adds commas between elements, which corrupts the input. inputChunks.join("") works.
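
A quick illustration of the difference, as a standalone sketch (not part of the gist):

var chunks = ['{"a": ', '1}'];
console.log(chunks.join());   // '{"a": ,1}'  -- comma inserted, invalid JSON
console.log(chunks.join('')); // '{"a": 1}'   -- valid JSON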

@sarnobat

Perfect. Exactly what I was looking for.

@craigsapp

Thanks for this gist. I learned this while studying it:

As a Stream, process.stdin can also be used in "old" mode that is compatible with scripts written for node prior v0.10. For more information see Stream compatibility.

In "old" Streams mode the stdin stream is paused by default, so one must call process.stdin.resume() to read from it. Note also that calling process.stdin.resume() itself would switch stream to "old" mode.

If you are starting a new project you should prefer a more recent "new" Streams mode over "old" one.
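
A minimal sketch of the "new" (paused) Streams mode, reading via 'readable' events instead of calling resume() (my own sketch, not code from the gist):

var input = '';
process.stdin.setEncoding('utf8');
process.stdin.on('readable', function () {
  var chunk;
  // read() returns null once the internal buffer is drained.
  while ((chunk = process.stdin.read()) !== null) {
    input += chunk;
  }
});
process.stdin.on('end', function () {
  process.stdout.write(JSON.stringify(JSON.parse(input), null, ' ') + '\n');
});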

@Roam-Cooper

Roam-Cooper commented May 26, 2016

Make inputChunks a string.

stdin.on('data', function (chunk) {
    inputChunks += chunk;
});

Then you don't need to join.

@noway

noway commented Jan 4, 2017

WARNING: it inserts "," between the chunks.

To fix this, replace

    var inputJSON = inputChunks.join(),

with

    var inputJSON = inputChunks.join(""),

@robinsax

robinsax commented Jun 2, 2018

@Roam-Cooper, I think the join approach is better as it potentially optimizes the concatenation.
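
One rough way to check that claim is a throwaway micro-benchmark like the sketch below (results will vary by Node version, chunk size, and chunk count):

const chunk = 'x'.repeat(64);
const N = 100000;

console.time('push + join');
const parts = [];
for (let i = 0; i < N; i++) parts.push(chunk);
const joined = parts.join('');
console.timeEnd('push + join');

console.time('string +=');
let s = '';
for (let i = 0; i < N; i++) s += chunk;
console.timeEnd('string +=');

console.log(joined.length === s.length); // sanity check: both build the same-length string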

@solex16

solex16 commented May 19, 2021

Using the double quotes in the join() also fixed this error I was getting:

'SyntaxError: Unexpected token , in JSON at position 65536'

@hasantayyar

Node.js's native readline utility can be helpful for splitting the input into lines if you are streaming line by line.
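
For example, a minimal sketch assuming the input is newline-delimited JSON (one value per line):

const readline = require('readline');

const rl = readline.createInterface({ input: process.stdin });

rl.on('line', (line) => {
  if (line.trim() === '') return;  // skip blank lines
  const value = JSON.parse(line);  // each non-empty line is assumed to be one JSON value
  process.stdout.write(JSON.stringify(value, null, 2) + '\n');
});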

@newbie-lad

That's what I am saying.

@tomsaleeba

tomsaleeba commented Jul 5, 2023

Thanks for the gist 💪.

Here's a terser version that uses some of the included fixes:

let inputJson = ''
process.stdin.setEncoding('utf8')
process.stdin.on('data', function (chunk) {
  inputJson += chunk
})
process.stdin.on('end', function () {
  const parsedData = JSON.parse(inputJson)
  process.stdout.write(JSON.stringify(parsedData, null, 2) + '\n')
})

@aellerton

Yet another updated and terser version, ready to edit and run!

function readStdin() {
  return new Promise(resolve => {
    let buf = ''
    process.stdin.setEncoding('utf8')
    process.stdin.on('data', chunk => (buf += chunk))
    process.stdin.on('end', () => resolve(buf))
  })
}

async function main() {
  const d = JSON.parse(await readStdin())
  // do stuff with d, then...
  console.log(JSON.stringify(d, null, 2))
}

main()

@danthegoodman

Here's a one-liner that does the same thing (at least in Mac and Linux land, not sure about Windows):

node -p 'JSON.stringify(JSON.parse(fs.readFileSync(0)),null,2)'

Notably, JSON.parse can work off of a buffer, and fs.readFileSync(0) reads file descriptor 0, which is standard input.
node -p evaluates a statement and logs its result. You could also write it with node -e 'console.log(...)' if you would rather be in control of when or how the logging happens.
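
Spelled out, that -e variant would look something like this (same expression as above, just wrapped in an explicit console.log):

node -e 'console.log(JSON.stringify(JSON.parse(fs.readFileSync(0)), null, 2))'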
