Command Line Tools and Unix I/O

Intro: Command Line Tools

  • Types of programs: side-effect-y (mkdir, touch) vs. function-y (cat, grep, wc), which produce output
  • Basic Unix "contract": Take in text stream and output text stream
  • Somewhat janky but works fairly well


  • head - show the first lines of a file
head /usr/share/dict/words
  • default: the output appears in your terminal

Pipes and Redirection

  • Default behavior: output goes to your terminal and gets printed
  • But sometimes we want to customize this
  • example: combining two programs, head and wc

Count the input (wc prints line, word, and byte counts):

head /usr/share/dict/words | wc

Or just the line count, with a flag:

head /usr/share/dict/words | wc -l

Command-line Args vs. text input

  • head <filename> vs cat /some/file | head
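
The difference is easiest to see with the same data arriving both ways. A small sketch (the /tmp file is a throwaway example, not from the original notes):

```shell
# create a small sample file to play with
printf 'apple\nbanana\ncherry\n' > /tmp/fruits.txt

head -n 2 /tmp/fruits.txt         # filename passed as an argument
cat /tmp/fruits.txt | head -n 2   # same data arriving on STDIN
# both commands print: apple, banana
```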

Redirection Scenarios

  • Send this output to another program
  • Send this output to a file
  • Unix "pipe" -- Send STDOUT from this program to STDIN of the next program
  • Fundamental command-line building block for connecting multiple programs to do useful stuff
  • Somewhat similar to chaining enumerables in ruby, e.g.:
[1,2,3,4].map { |i| i * 2 }.map { |i| i - 1 }.select { |i| i.odd? }
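
An analogous shell pipeline, as a sketch (seq and awk stand in for the enumerables; they are not part of the original notes):

```shell
# seq emits 1..4; each stage transforms the stream, like a chained .map
seq 1 4 | awk '{ print $1 * 2 }' | awk '{ print $1 - 1 }' | grep '[13579]$'
# prints 1, 3, 5, 7 -- one number per line
```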

Common "Pipe" Scenarios

  • How long is this input?
  • Select lines in this input matching a string or regex
  • Replace a certain text pattern in this input with another
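
Each scenario maps to a standard tool. A sketch (the sample file and its path are made up for illustration):

```shell
printf 'cat\ndog\ncatfish\n' > /tmp/animals.txt

wc -l < /tmp/animals.txt              # how long is this input? -> 3
grep 'cat' /tmp/animals.txt           # lines matching a pattern -> cat, catfish
sed 's/cat/feline/' /tmp/animals.txt  # replace a pattern -> feline, dog, felinefish
```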


Standard Streams

  • Most programs work with three standard input/output streams
  • "Standard In": Text that was given to me
  • "Standard Out": Text that I am producing for consumption
  • "Standard Error": A second output stream, but often used for debug messages, warnings, etc
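
The three streams are identified by file descriptor numbers, which is what redirection syntax refers to. A sketch (the /tmp paths are arbitrary):

```shell
# 0 = stdin, 1 = stdout, 2 = stderr
sh -c 'echo to-stdout; echo to-stderr >&2' 1>/tmp/out.txt 2>/tmp/err.txt
cat /tmp/out.txt   # to-stdout
cat /tmp/err.txt   # to-stderr
```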

In Ruby, for example

  • $stdin, $stdout, $stderr
  • $stdin is default target of gets
  • $stdout is default target of puts
ruby -e "10.times { puts 'hello' }" | wc -l
ruby -e "10.times { \$stderr.puts 'hello' }" 2> /tmp/my_err_lines.txt
cat /tmp/my_err_lines.txt
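
By default STDERR bypasses a pipe, which only carries STDOUT; `2>&1` merges it in first. A plain-shell sketch of the same idea:

```shell
# without the merge, wc -l would only see the stdout line
{ echo 'normal output'; echo 'an error' >&2; } 2>&1 | wc -l
# counts both lines: 2
```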

Power of piped IO: Stream processing

  • This abstraction lets us process things in small chunks, i.e. "lazily" or "streaming"
  • If written correctly, this allows programs to process volumes of data that would never fit into memory
  • Common convention is line-based processing: read characters until you hit a newline, process that line, then continue
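
A sketch of that convention in the shell itself: `read` pulls one line at a time, so memory use stays constant no matter how long the input is.

```shell
# process each line as it arrives, without buffering the whole stream
seq 1 5 | while read -r n; do
  echo "processed line $n"
done
```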

Unix Tour: Some Handy Tools

  • echo
  • head
  • tail
  • cat
  • sort
  • uniq
  • wc
  • curl
  • jq
  • sed
  • grep
  • pv
  • cut
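
Several of these compose into the classic frequency-counting pipeline. A sketch with inline sample data:

```shell
# sort groups duplicates, uniq -c counts them, sort -rn ranks by count
printf 'apple\nbanana\napple\ncherry\napple\n' \
  | sort | uniq -c | sort -rn | head -n 1
# top line shows apple with a count of 3
```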

These can also be combined with custom programs of our own

e.g. processing stdin with ruby:

head /usr/share/dict/words | ruby -e "count = 0; while line = \$stdin.gets; count += 1; end; puts count"

Executables, Chmod, Shebang, PATH

  • Why can you just type some commands into your shell? (ruby, node, cat, etc)
  • Unix "Permissions": the OS's way of managing which actions can be performed on which files
  • Read, Write, or "Execute"
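
The execute bit plus a shebang line is what turns a text file into a runnable command. A sketch (the /tmp path and script name are made up):

```shell
# the shebang (#!) names the interpreter the kernel should launch
printf '#!/bin/sh\necho hello from my script\n' > /tmp/hello.sh

chmod +x /tmp/hello.sh   # grant the execute permission bit
/tmp/hello.sh            # prints: hello from my script
```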