Bash Explained: Chained Killing

In previous roles with more devops responsibility I frequently found myself having to kill multiple processes with the same name, and I kept coming back to this chain of commands: ps aux | grep stupid_process | awk '{print $2}' | xargs sudo kill -9

Now, in my development roles, I still find myself leveraging those bash skills in my day-to-day work. I want to break this command down into its individual pieces and explain how each one might be used in other scenarios, so that hopefully the next time you are aggressively Stack Overflowing a problem and see a long chain of bash, you can do the same.
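
Here is the whole chain again, one stage per line with a comment on each. Bash keeps reading the next line after a trailing pipe, so this is still a single command:

ps aux |                  # list every running process
  grep stupid_process |   # keep only the lines mentioning the process we care about
  awk '{print $2}' |      # pull out the second column: the process id
  xargs sudo kill -9      # hand the ids to kill -9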

So let's start with ps aux

ps returns a list of currently running processes, where aux is three separate flags: a lists processes for all users, u switches to the user-oriented output format, and x includes processes that aren't attached to a terminal. Together they show every process running on the system.

The output would look something like this:

USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root         1  0.0  0.2 225400  9408 ?        Ss    2018  16:19 /lib/systemd/systemd --system --deserialize 40
root         2  0.0  0.0      0     0 ?        S     2018   0:00 [kthreadd]
root         4  0.0  0.0      0     0 ?        I<    2018   0:00 [kworker/0:0H]
root         6  0.0  0.0      0     0 ?        I<    2018   0:00 [mm_percpu_wq]
root         7  0.0  0.0      0     0 ?        S     2018   2:30 [ksoftirqd/0]
root         8  0.0  0.0      0     0 ?        I     2018   6:46 [rcu_sched]
root         9  0.0  0.0      0     0 ?        I     2018   0:00 [rcu_bh]
root        10  0.0  0.0      0     0 ?        S     2018   0:15 [migration/0]
root      6361  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6362  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6363  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6364  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6365  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6366  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process

In my normal day-to-day I use this to check whether another Node process is running and interfering with my integration tests.

Next we want to find only the processes that we want to kill, which brings us to:

grep stupid_process

grep is a simple utility that searches its input for the string (or pattern) you give it. By piping the output of ps aux into grep, we keep only the lines that contain the string stupid_process.

The output of ps aux | grep stupid_process would look something like this:

root      6361  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6362  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6363  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6364  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6365  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
root      6366  0.0  0.0      0     0 ?        I     2018   6:46 /lib/stupid_process
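
One caveat worth knowing: the grep process itself will often show up in this output too, since its own command line contains the string stupid_process. A common way to filter it back out is an extra grep -v:

ps aux | grep stupid_process | grep -v grep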

I also use grep quite frequently in combination with cat for CSV or log files. For example, I would run cat user_payments.csv | grep premium_members to return a list of all the premium members. Another common use case is running ps aux | grep node, although there is a command called pgrep that accomplishes much the same thing.
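
A quick sketch of that pgrep alternative (the process names are just examples):

pgrep node                # print the ids of processes named node
pgrep -f stupid_process   # -f matches against the full command line, not just the process name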

Next is awk '{print $2}'.

awk is a command that scans its input line by line and splits each line into columns. In our process-killing use case we want the second column of the grep output, which gives us the list of process ids we can send to kill next.

The output of ps aux | grep stupid_process | awk '{print $2}' would look something like this:

6361
6362
6363
6364
6365
6366

I find awk incredibly useful whenever I have to do any data analysis with CSVs. By default awk splits lines into columns by whitespace, but you can specify the delimiter you want to use: awk -F',' '{print $3}' myfile.csv fetches the third column from myfile.csv. I will often chain awk together with commands like sort, uniq and wc to get a count of the unique values in a column, with something like awk -F',' '{print $3}' myfile.csv | sort | uniq | wc -l. The sort is needed because uniq only collapses adjacent duplicate lines.
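
Broken out one stage per line (myfile.csv and the column number are just placeholders from the example above):

awk -F',' '{print $3}' myfile.csv |   # pull the third comma-separated column
  sort |                              # group identical values next to each other
  uniq |                              # collapse adjacent duplicates
  wc -l                               # count the lines that remain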

Finally we have xargs sudo kill -9

We have a list of process ids we want to kill. We could write a for loop, but xargs makes this much simpler. xargs takes a list of inputs separated by newlines (or other whitespace) and passes them as arguments to the command you give it; by default it batches them, so here it effectively runs sudo kill -9 6361 6362 … in one go, and with -n 1 it would run the command once per input. I like to think of it as the forEach function found in many languages: in our scenario we take the list of process ids and call sudo kill -9 on them.
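
If you are nervous about what will actually be killed, one simple dry run is to put echo in front of the command, so xargs just prints what it would have executed:

ps aux | grep stupid_process | awk '{print $2}' | xargs echo sudo kill -9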

Another example of where I have used xargs is when processing files. For example, during a TypeScript migration I had to add a line to the top of a large number of files: find . -name '*.ts' | xargs gsed 'do some thing to the file' (the glob needs quoting so the shell doesn't expand it before find sees it; gsed is GNU sed, which on macOS usually comes from Homebrew).
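
A concrete sketch of that kind of step, assuming GNU sed is available as gsed. The inserted line // @ts-nocheck is only an illustrative guess, not the actual line from that migration:

find . -name '*.ts' -print0 |            # find every .ts file; -print0/-0 keep paths with spaces intact
  xargs -0 gsed -i '1i // @ts-nocheck'   # GNU sed: insert the line above line 1 of each file, in place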

Of course, you can also kill these processes with pkill stupid_process, which sends a signal (SIGTERM by default) to every process whose name matches the pattern. The advantage of the longer chain of commands is that you can see what you are doing at each step and make sure you don't accidentally kill something you are not supposed to.
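
For completeness, a rough sketch of the pkill route:

sudo pkill stupid_process        # send SIGTERM to processes whose name matches
sudo pkill -9 -f stupid_process  # -9 sends SIGKILL; -f matches against the full command line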
