A cool map-style pattern for parallel execution and logging in the shell
# This shows a bit of shell muscle power.
# First, we call find with -maxdepth 1, which restricts the search to
# the current directory (no recursion into subdirectories).
# Then -type f matches only regular files, and \! -name 'config.json'
# excludes that one file.
# Together, these flags give us every file (not directories, etc.)
# in this directory except `config.json`.
# Thanks to Ike Levy for helping me figure out this badassery.
# We pipe that list to xargs. `-I file` names the replacement string,
# so every occurrence of `file` in the command that follows is swapped
# for the current filename; we are effectively mapping a command over
# each file in this directory.
# `-P 12` says to run up to 12 (or whatever) concurrent processes.
# The actual command is `sh -c`, and inside its string argument we call
# `application` with some flags, with the `-f` flag taking `file`,
# i.e. each file in our directory in turn.
# Finally, each run's output is redirected to `file.log`, a file named
# after the original with `.log` appended.
# So in a directory containing `foo.bar` and `bar.baz`, this calls
# `../application` on each of them in the `-f` flag position and
# redirects the output to `foo.bar.log` and `bar.baz.log` respectively.
find . -maxdepth 1 -type f \! -name 'config.json' | xargs -I file -P 12 sh -c '../application -f file -o option -vv > file.log'
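To see what the `find` half produces on its own, here is a small demo in a throwaway directory (the filenames are just illustrative):

```shell
# Build a scratch directory with two files, an excluded config.json,
# and a subdirectory, then run the find command from above.
dir=$(mktemp -d)
cd "$dir"
mkdir sub
touch foo.bar bar.baz config.json
find . -maxdepth 1 -type f \! -name 'config.json'
# prints ./foo.bar and ./bar.baz (in no guaranteed order);
# config.json and the sub/ directory are skipped
```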
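One caveat with the one-liner above: splicing the filename directly into the `sh -c` string means a filename containing quotes or `$` gets re-interpreted by the inner shell. A sketch of a safer variant, keeping the same `../application` invocation from the snippet, passes the filename as a positional parameter instead:

```shell
# Same pipeline, but the filename travels as "$1" rather than being
# textually substituted into the command string, so shell
# metacharacters in filenames are handled safely.
find . -maxdepth 1 -type f \! -name 'config.json' \
  | xargs -I{} -P 12 sh -c '../application -f "$1" -o option -vv > "$1.log"' sh {}
```

The trailing `sh {}` supplies `$0` and `$1` to the inner shell. Filenames containing newlines would still confuse line-based `xargs`; GNU `find -print0` with `xargs -0` closes that gap too.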