%title: CLI - Get to know your Command Line
%author: Maciek Swiech, PhD
%date: 2017-09-05

-> CLI (Command Line Interface) <-

-> Living in the Command Line, with Dr Masquatch <-


-> Agenda to Cover <-

  • Shells!
  • Directory Navigation
  • GNU coreutils & others
  • Combining with pipes
  • CLI editors
  • Bash Scripting
  • Process Management
  • Remote Connections

-> Agenda to Cover <-

  • Shells!
    • What are they?
    • Prompt

-> # Shells <-

When you open a terminal, you are greeted with a friendly prompt:

$ _ 

This is where you are able to interact with whatever shell you are using.


-> # Shells <-

When you open a terminal, you are greeted with a friendly prompt:

$ _ 

This is where you are able to interact with whatever shell you are using.

E.g.

  • sh (the 'original')
  • bash (Bourne again shell)
  • zsh (z shell)
  • csh (c shell)
  • fish
  • ... many more
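
If you're not sure which shell you're currently in, a quick (if slightly shell-dependent) check is to print $0 or $SHELL. The output below assumes bash is both your current and login shell:

$ echo $0
-bash
$ echo $SHELL
/bin/bash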

-> # Shell Prompt <-

The symbol preceding the prompt is actually giving you some information!

$ _

Is the prompt you expect to see when you are logged in as a normal (non-privileged) user.

What about for a privileged user?


-> # Shell Prompt <-

The symbol preceding the prompt is actually giving you some information!

$ _

Is the prompt you expect to see when you are logged in as a normal (non-privileged) user.

# _

If you see this prompt, you are logged in as the root user - take care here, as you have nigh unlimited access to the system, and can change a lot!


-> Agenda to Cover <-

  • Directory Navigation!
    • Common commands
    • Permissions
    • Relative vs Absolute Paths
    • Navigating Back and Forth

-> # Directory Navigation <-

The most commonly used commands for navigating directory structures in the command line are:

  • ls (list)
  • cd (change directory)
  • mkdir (make a directory)
  • rm (remove)
  • mv (move)
  • cp (copy)

-> # Directory Navigation <-

The most commonly used commands for navigating directory structures in the command line are:

  • ls (list)
  • cd (change directory)
  • mkdir (make a directory)
  • rm (remove)
  • mv (move)
  • cp (copy)

Let's take a look at the output of ls:

$ ls -l
total 8
-rw-------  1 maciekswiech  staff     0 Sep  1 15:57 backup_text.txt
-rw-------  1 maciekswiech  staff  1373 Sep  1 15:54 cli.md
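
Before we dig into those columns of letters, here is a minimal tour of the other commands (the file and directory names are just made up for illustration):

$ mkdir notes                # make a directory
$ cd notes                   # move into it
$ cp ../cli.md draft.md      # copy a file into the current directory
$ mv draft.md final.md       # rename (move) it
$ rm final.md                # delete it
$ cd ..                      # back up to the parent directory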

-> # Directory Navigation <-

The most commonly used commands for navigating directory structures in the command line are:

  • ls (list)
  • cd (change directory)
  • mkdir (make a directory)
  • rm (remove)
  • mv (move)
  • cp (copy)

Let's take a look at the output of ls:

$ ls -l
total 8
-rw-------  1 maciekswiech  staff     0 Sep  1 15:57 backup_text.txt
-rw-------  1 maciekswiech  staff  1373 Sep  1 15:54 cli.md

What do those letters to the side mean? They are permissions, broken out by who they apply to!

  +---------------+-------+-------+-------+
  |               | Owner | Group | World |     r: read
  | directory bit | r w x | r w x | r w x |     w: write
  +---------------+-------+-------+-------+     x: execute


-> # File Permissions <-

  +---------------+-------+-------+-------+
  |               | Owner | Group | World |     r: read
  | directory bit | r w x | r w x | r w x |     w: write
  +---------------+-------+-------+-------+     x: execute

These permissions can be thought of as binary, and used in the chmod command. For example, if the owner can read/write, and all others can read, this looks like:

rw- r-- r--

Or, in binary:

110 100 100

So you could achieve this by saying:

$ chmod 644 backup_text.txt
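
For example (assuming backup_text.txt started out owner-only read/write, as in the listing earlier):

$ ls -l backup_text.txt
-rw-------  1 maciekswiech  staff  0 Sep  1 15:57 backup_text.txt
$ chmod 644 backup_text.txt
$ ls -l backup_text.txt
-rw-r--r--  1 maciekswiech  staff  0 Sep  1 15:57 backup_text.txt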

-> # File Permissions <-

  +---------------+-------+-------+-------+
  |               | Owner | Group | World |     r: read
  | directory bit | r w x | r w x | r w x |     w: write
  +---------------+-------+-------+-------+     x: execute

You don't have to use the numeric representation; you can directly set the specific bits you want using letters.

If you were writing a file you'd like to run, you have to make it executable!

$ ls -l my_script.sh
-rw-------  1 maciekswiech  staff  31 Sep  1 20:19 my_script.sh
$ chmod +x my_script.sh
$ ls -l my_script.sh
-rwx------  1 maciekswiech  staff  31 Sep  1 20:19 my_script.sh
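
A few more symbolic examples (the file names here are arbitrary):

$ chmod u+x my_script.sh     # add execute for the owner only
$ chmod go-w notes.txt       # remove write from group and world
$ chmod a+r README.md        # let everyone read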

-> # Relative vs. Absolute Paths <-

File paths can be specified in one of two ways (a quick example follows this list):

  • Absolute
    • The full path of the file
      • e.g. /var/log/nginx/errors.log
      • This path is 'absolute' because it doesn't matter where you are
  • Relative
    • Relative to your current working directory (aka cwd/pwd)
      • e.g. ~/Desktop/screen_shot.jpg
    • Common UNIX aliases:
      • ~ - maps to your home directory (e.g. /home/maciek)
      • ~user - maps to a specific user's home directory, e.g. ~msw978
      • . - maps to the cwd/pwd
      • .. - maps to the parent directory
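
A quick illustration of both styles plus the aliases (the directories here are made up):

$ cd /var/log/nginx          # absolute: works no matter where you are
$ cd ../apache2              # relative: a sibling of the current directory
$ cd ~/Desktop               # ~ expands to your home directory
$ cd ..                      # up one level, back to ~
$ cd .                       # stays right where you are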

-> # Navigating Back and Forth <-

OK, so now you're experts in CLI navigation. But can we be faster than just using cd all the time? Sure!

Last directory

Let's say I want to switch back and forth between two directories. Here we can use the cd - command.

$                   # cwd: ~/project/files/
$ cd ../tests       # cwd: ~/project/tests/
$ cd -              # cwd: ~/project/files/
$ cd -              # cwd: ~/project/tests/

Your shell remembers where you were last, so cd - will jump back there.


-> # Navigating Back and Forth <-

It turns out that your shell can remember more than one location; it can keep a whole stack of them!

Pushd/Popd

  • pushd
    • Pushes the current directory onto the stack and changes to the specified directory
  • popd
    • Pops directory from the stack, and changes to it
  • dirs
    • Prints the current stack

-> # Navigating Back and Forth <-

It turns out that your shell can remember more than one location; it can keep a whole stack of them!

Pushd/Popd

$                           # cwd: ~/project/files/
$ pushd ~/other-project     # cwd: ~/other-project/
$ dirs
~/other-project ~/project/files
$ pushd /var/log            # cwd: /var/log/
$ dirs
/var/log ~/other-project ~/project/files
$ popd                      # cwd: ~/other-project/
$ dirs
~/other-project ~/project/files
$ popd                      # cwd: ~/project/files/
$ dirs
~/project/files

-> Agenda to Cover <-

  • GNU coreutils & others
    • echo
    • cat
    • sed
    • awk
    • cut
    • uniq
    • sort
    • join
    • wc
    • grep/ack/ag
    • head/tail

-> # GNU coreutils & others <-

The basic philosophy of UNIX programs is: "do one thing, and do it well". Let's walk through some of the tools that will be found on virtually every UNIX system that you might be logging into.

Don't worry about memorizing everything we'll cover today, you can always look up commands later, even on the command line, using the man (manual) command!

$ man ruby
RUBY(1)                Ruby Programmers Reference Guide                RUBY(1)

NAME
     ruby -- Interpreted object-oriented scripting language

-> # GNU coreutils & others <-

echo - print strings

echo will 'echo' whatever we pass to it back out. Simple, eh?

$ echo hello
hello
$ echo 'hello here are some more words'
hello here are some more words

-> # GNU coreutils & others <-

cat - file printer

cat is perhaps one of the most basic of tools in our toolchest. It will print whatever file we ask it to, and concatenate the output if we choose multiple files.

$ cat a.txt
hello
$ cat b.txt
world
$ cat a.txt b.txt
hello
world

-> # GNU coreutils & others <-

sed - the stream editor

sed allows you to edit a stream (or file). By far the most common usage of sed is its find-and-replace functionality. The 's/find/replace/' syntax (which strictly speaking comes from its precursor, ed) is very versatile, and supports regular expressions.

$ cat c.txt
hello my name is fred
$ sed 's/fred/maciek/' c.txt
hello my name is maciek

One powerful switch to sed is that it can do the replacement 'in place'! Be careful, as this will overwrite your file.

$ sed -i 's/fred/maciek/' c.txt
$ cat c.txt
hello my name is maciek

-> # GNU coreutils & others <-

sed - the stream editor

sed allows you to edit a stream (or file). By far the most common usage of sed is its find-and-replace functionality. The 's/find/replace/' syntax (which strictly speaking comes from its precursor, ed) is very versatile, and supports regular expressions.

$ cat c.txt
hello my name is fred
$ sed 's/fred/maciek/' c.txt
hello my name is maciek

One powerful switch to sed is that it can do the replacement 'in place'! Be careful, as this will overwrite your file.

$ sed -i 's/fred/maciek/' c.txt
$ cat c.txt
hello my name is maciek

PSA: on macOS, the coreutils are actually their BSD counterparts. Most of the time they are drop-in replacements, but BSD sed requires a backup-file extension (which can be empty) after the in-place flag, so you would have to do

sed -i '' 's/fred/maciek/' c.txt

-> # GNU coreutils & others <-

sed - the stream editor

You can, of course, get much more powerful substitutions with regular expressions

$ cat d.txt
hello
-my-
-name
-is-
antonio
$ sed 's/-\(.*\)-/+\1+/' d.txt
hello
+my+
-name
+is+
antonio

-> # GNU coreutils & others <-

cut - cut out portions of each line

cut lets us cut out specific portions of each line of a file or stream.

  • -d the delimiter to use for fields
  • -f the list of fields to print out
$ cat e.txt
a,b,c
d,e,f
g,h,i
j,k,l
$ cut -d , -f 1,3 e.txt
a,c
d,f
g,i
j,l

-> # GNU coreutils & others <-

awk - pattern-directed scanning and processing language

awk is a hugely powerful language for processing files and streams. The only part we will cover today is using it as a slightly nicer version of cut.

$ cat e.txt
a,b,c
d,e,f
g,h,i
j,k,l
$ awk -F, '{print $1,$3}' e.txt
a c
d f
g i
j l

-> # GNU coreutils & others <-

uniq - Find unique values

uniq will take a stream or file and print out unique values as it encounters them. One caveat: it only collapses adjacent duplicates, so the input must be sorted for it to work the way you'd expect.

$ cat f.txt
a
a
b
c
c
$ uniq f.txt
a
b
c
$ cat g.txt
a
b
a
$ uniq g.txt
a
b
a

-> # GNU coreutils & others <-

sort - sort some stuff!

sort is... well... pretty self-explanatory. It sorts a file.

$ cat g.txt
a
b
a
$ sort g.txt
a
a
b

-> # GNU coreutils & others <-

join - yep, relational joins

join will perform an inner join between two files or streams, defaulting to using the first field as the join column.

$ cat left.txt
a,1,10
b,2,20
c,3,30
$ cat right.txt
a,a,aa
c,c,cc
d,d,dd
$ join -t, left.txt right.txt
a,1,10,a,aa
c,3,30,c,cc

-> # GNU coreutils & others <-

wc - word count

wc will count up the lines, words, and bytes in a file (in that order, or just the part you ask it for). It can also count multiple files.

$ wc some_big_file.txt
 1591932  1591934 48686034 some_big_file.txt
$ wc some_big_file.txt some_small_file.txt
 1591932  1591934 48686034 some_big_file.txt
       2       20      161 some_small_file.txt
 1591934  1591954 48686195 total
$ wc -l some_big_file.txt        # just show the line count
 1591932 some_big_file.txt

-> # GNU coreutils & others <-

head/tail - grab only parts of a file

If you want to only look at a part of a file, you can use head and tail to print the beginning or end of that file. By default, the first or last 10 lines are printed, but this can be configured.

$ head -n 3 some_big_file.txt
abc,123
def,456
ghi,789
$ tail -n 1 some_big_file.txt
ZZZ,119

Another common usage of tail is to 'follow' a file that is being updated in real time.

$ tail -f /var/log/nginx/error.log
some
last
lines

-> # GNU coreutils & others <-

head/tail - grab only parts of a file

If you want to only look at a part of a file, you can use head and tail to print the beginning or end of that file. By default, the first or last 10 lines are printed, but this can be configured.

$ head -n 3 some_big_file.txt
abc,123
def,456
ghi,789
$ tail -n 1 some_big_file.txt
ZZZ,119

Another common usage of tail is to 'follow' a file that is being updated in real time.

$ tail -f /var/log/nginx/error.log
some
last
lines

...

new line     <--- this appeared while we were running the command

-> # GNU coreutils & others <-

grep/ack/ag - search a file

Sometimes you need to find where in a file, or in a set of files, a certain word or pattern occurs. UNIX once again comes to the rescue with a multitude of tools. grep is a good tool to know because you'll be able to find it in pretty much any environment.

$ grep end app/controllers/backend_controller.rb
class BackendController < ApplicationController
$ grep -ni end app/controllers/backend_controller.rb   # n: show line number
                                                       # i: case insensitive
3:class BackendController < ApplicationController
5:class EndController < ApplicationController
$ grep -rni end .                                      # r: search recursively
./app/models/customer_bonus.rb:42:      cn.status = :pending
./app/models/customer_bonus.rb:44:      cn.message_key = 'notification_messaging.bonus_pending'

-> Agenda to Cover <-

  • Combining with pipes
    • Standard fd's
    • Pipes
    • File redirection

-> # Pipes and Redirection <-

OK, so now we have all these great tools that can do one thing. But it turns out that the real power of the command line is that you can combine them arbitrarily to perform complex operations on files and streams.

Each process in Linux has 3 standard file descriptors, or streams where data can flow in and out (a short example follows this list):

  • STDIN - this is where the process gets input, often from the command line directly
  • STDOUT - this is where the process prints its 'normal' output
  • STDERR - this is where the process prints its errors
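
A tiny illustration of the split (the file names here are made up; assume present.txt contains the word hello and missing.txt doesn't exist). The error message still lands on your terminal via STDERR, while the match travels through STDOUT into matches.txt:

$ grep hello present.txt missing.txt > matches.txt
grep: missing.txt: No such file or directory
$ cat matches.txt
present.txt:hello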

-> # Pipes and Redirection <-

Pipes

By using the | (pipe) operator, we can 'connect' the STDOUT of one process to the STDIN of another process. We can do this any number of times that we like.

Let's think back to uniq: it needs sorted input to work correctly. We can now chain operations together!

$ cat g.txt
a
b
a
$ cat g.txt | sort
a
a
b
$ cat g.txt | sort | uniq
a
b

-> # Pipes and Redirection <-

Redirection

You can also redirect the output (STDOUT) of a process to a file using the > operator, or append to it with the >> operator. Redirecting to an existing file will overwrite whatever was there before, so take care.

$ some_command > log.txt
$ another_command >> log.txt

What if you want to redirect STDERR to a file? The file descriptor number for this is 2.

$ some_command 2> error.log        # STDERR writes to a file
$ some_command 2> error.log > log  # STDERR and STDOUT write to different files
$ some_command > all.log 2>&1      # STDERR and STDOUT write to the same file
$ some_command &> error.log        # STDERR and STDOUT write to the same file

-> # Pipes and Redirection <-

Redirection

Another very useful tool here is tee. It is named after a T-shaped pipe junction: conceptually it 'branches' the stream instead of just passing it along -- it writes its input both to a file AND back out to the console.

$ some_command | tee out.txt
the results
of the command

-> Agenda to Cover <-

  • CLI editors
    • nano
    • vim / emacs

-> # CLI Editors <-

Most of the time you are at your own machine and can use a full-fledged IDE or other code editor.

But sometimes you have to connect to a remote machine, or make a quick one-off change locally that you don't want to open up in your editor, especially if the file doesn't fit into your specific project flow.


-> # CLI Editors <-

Nano

Perhaps one of the simplest CLI editors is nano. It is called that, well, because it is super small.

Some text I want to write

^G Get Help    ^O WriteOut    ^R Read File   ^Y Prev Page   ^K Cut Text    ^C Cur Pos
^X Exit        ^J Justify     ^W Where Is    ^V Next Page   ^U UnCut Text  ^T To Spell

Nano's interface is just typing, with all of the available commands enumerated at the bottom of your screen. It can be great for one-off changes to a file.


-> # CLI Editors <-

Vim/Emacs

Internet wars have raged over the supremacy of vim vs. emacs. I'm not going to engage here.

$ vimtutor

Will provide you with an interactive way to learn vim.

$ emacs
> control-h t

Will provide you with an interactive way to learn emacs.


-> # CLI Editors <-

Vim/Emacs

How the hell do I quit out of these???

  • Vim
    • :q
  • Emacs
    • Control-X Control-C

-> Agenda to Cover <-

  • Bash Scripting
    • Basic syntax
    • Variables
    • If/else
    • Loops
    • Command Output

-> # Bash Scripting <-

Basic Syntax

In your typical bash script, you will want to start the file with a 'hashbang' (shebang) line, indicating which interpreter will run the file

#!/bin/bash

After that, you can just start including the various commands you would like to be run:

#!/bin/bash
cat file.txt
echo "hello" | sort | uniq

You have to make the file executable in order to run it.

$ chmod +x my_bash_script.sh
$ ./my_bash_script.sh

-> # Bash Scripting <-

Variables

Variables in bash are created with the = operator (note the lack of spaces)

#!/bin/bash
my_var='hello my name is inigo montoya'

You can then reference the variable by prefixing it with a $.

#!/bin/bash
my_var='hello my name is inigo montoya'

echo $my_var

In more complex scripts, it is preferred to use ${}

#!/bin/bash
my_var='hello my name is inigo montoya'

echo ${my_var}

-> # Bash Scripting <-

Variables

You can also grab the script's command-line arguments using the $NUMBER notation.

#!/bin/bash

echo $1

Would behave like:

$ ./my_bash_script.sh "hello to you"
hello to you

Note that without the quotes here, bash would treat this as 3 different args!


-> # Bash Scripting <-

If/else

If/else statements are fairly simple in Bash.

#!/bin/bash
my_var='yes'

if [ "yes" = "$my_var" ]; then     # string interpolation
  echo "var was yes!"
else
  echo "var was not yes!"
fi                                 # yup, backwards if

There are a whole slew of bash conditional statements, refer to online guides.
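
A few of the more common test operators, as a taste (the file and variable names below are just placeholders):

#!/bin/bash
count=3

if [ -f config.yml ]; then echo "config.yml exists"; fi        # -f: file exists
if [ -z "$name" ]; then echo "name is empty or unset"; fi      # -z: zero-length string
if [ "$count" -gt 10 ]; then echo "count is more than 10"; fi  # -gt: numeric greater-than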


-> # Bash Scripting <-

Loops

There are also a few different ways to loop in bash; I'll just show a couple of the common ones

#!/bin/bash

for i in $(seq 1 5); do
  echo "The current number is $i"
done

#!/bin/bash

cd my_project
for file in *; do
  echo "Looking at the file $file"
done

-> # Bash Scripting <-

Output of commands

You'll notice that in the previous example, we used an external command, seq, to generate a sequence of numbers.

There are 2 common ways of using the output of commands:

  • Backticks
    • Probably the simpler method, easy to use
    • echo "The output was `cat a.txt`"
  • $()
    • More robust, and can even be nested
    • Also easy to paste in Slack ;)
    • echo $(cat $(echo 'a.txt'))
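
A small combined sketch (assuming a.txt from the earlier examples is sitting in the current directory):

#!/bin/bash
contents=`cat a.txt`              # backticks: capture the command's output
line_count=$(wc -l < a.txt)       # $(): same idea, and it nests cleanly
echo "a.txt says '${contents}' across ${line_count} line(s)"
echo "We are inside the directory $(basename "$(pwd)")"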

-> Agenda to Cover <-

  • Process Management
    • ps
    • top/htop
    • Backgrounding
    • Killing
    • %[number]

-> # Process Management <-

Every command that you run on the command line (and every program running outside it) is a UNIX process. It can be important to be able to interact with processes from the terminal.

ps

The ps tool will list the various processes running on your machine. It seems everybody has their own favorite incantation; here is one:

ps auxww

Whatever format you choose, you will see a list of the processes currently running! Even more importantly, you will be shown each one's PID, or process identifier.
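
A common pattern is to pipe ps into grep to find a particular process. The output below is only representative; your columns, usernames, and PIDs will differ:

$ ps auxww | grep nginx
root       1234   0.0  0.1  24596  2372 ?   Ss   10:02   0:00 nginx: master process
maciek     5678   0.0  0.0   4280   680 s0  S+   10:05   0:00 grep nginx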


-> # Process Management <-

top/htop

For a nicer-formatted, live-updating view, you can use the top (default) or htop (extra install) programs. They can show you some nicely consolidated info, but at the end of the day you're really getting the same thing.


-> # Process Management <-

Backgrounding

Let's say you want to run a long process, but don't need to be interacting with it directly. You can launch the program to run in the 'background' by appending a & to your command.

$ long_command &

Alternatively, if you forgot to do this, you can press Control-Z to suspend the process, then run bg to continue it in the background!

What if you want to get it back? You can use the fg, or foreground command.

If you have multiple backgrounded commands, you can tell fg which job to bring back, e.g. fg %2 (more on the %[NUMBER] syntax in a minute).
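
Putting it together (long_command is a placeholder, and the job and PID numbers are made up):

$ long_command &                   # run in the background
[1] 12345
$ another_long_command &
[2] 12346
$ fg %1                            # bring the first one back to the foreground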


-> # Process Management <-

Killing

What if a process needs to be stopped? Maybe it is hanging, or maybe you just don't need it around anymore. You can KILL it! >:)

$ kill MY_PID

kill will actually send a termination signal to the process. There are more than a dozen of these, but the two most commonly used ones are:

  • SIGTERM
    • signal 15 (usually)
    • sent by default, will 'nicely' terminate the process
  • SIGKILL
    • signal 9
    • will force the process to terminate
    • only use if normal kill is not behaving properly

Using SIGKILL:

$ kill -9 MY_PID
   # or
$ kill -s SIGKILL MY_PID

-> # Process Management <-

%[NUMBER]

Using PIDs can be cumbersome. An easier method, if the job you care about lives in your current terminal session, is to use job numbers. List the jobs in your session using jobs:

$ jobs
[1]    suspended  mdp cli.md
[2]  - suspended  mdp cli.md
[3]  + suspended (signal)  man kill

You can now use the %[NUMBER] syntax instead of PIDs:

$ kill -9 %1

-> Agenda to Cover <-

  • Remote Connections
    • ssh/mosh
    • scp

-> # Remote Connections <-

SSH

Probably everyone will be familiar with this, but if you need to connect to a remote machine, you can do so over ssh

$ ssh my-machine.i.ibotta.com

You can also specify the username:

$ ssh maciek@my-machine.i.ibotta.com

Mosh

If you are switching between WiFi connections a lot, you can use mosh in the same way as you would ssh, and it will attempt to keep the connection alive.

You have to have mosh installed on both the client and server side.
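
The invocation mirrors ssh (reusing the hostname from the examples above):

$ mosh maciek@my-machine.i.ibotta.com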


-> # Remote Connections <-

SCP

If you need to transfer files between two machines, you can do so using the scp (secure copy) command.

It has the syntax of scp origin destination, so you could copy to a remote place using

$ scp my_file my_machine.i.ibotta.com:

or from a remote location using:

$ scp my_machine.i.ibotta.com:my_file .

The : denotes a remote location, after which you can put a relative or absolute path.


-> # Thanks! <-

-> ## Who has the first question? <-
