Working on remote machines
Some (trivial?) commands to work on a remote machine using the SSH, FTP and HTTP protocols.
Generate SSH keys to use an SSH connection without having to enter the password every time you log in.
Generate the key (local machine):
$ cd ~/.ssh
$ ssh-keygen -t rsa -C "email@example.com"
$ chmod 700 id_rsa
If ~/.ssh does not exist, create it:
$ mkdir ~/.ssh
ssh-keygen will create two files with the private and public keys: id_rsa and id_rsa.pub. Skip this step if you already have the keys on your machine!
Log on to the remote host (remote machine):
$ ssh firstname.lastname@example.org
$ cd ~/.ssh
If ~/.ssh does not exist, create it:
$ mkdir ~/.ssh
Copy your public key (local machine):
$ cat ~/.ssh/id_rsa.pub | pbcopy
pbcopy will copy the content to the Clipboard.
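pbcopy is a macOS tool; on a Linux machine something like xclip (if installed) does the same job, for example:
$ cat ~/.ssh/id_rsa.pub | xclip -selection clipboard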
Add the public key to the remote host (remote machine):
$ pbpaste > ~/.ssh/authorized_keys # to create the file
$ chmod 644 authorized_keys
pbpaste will paste the content from the Clipboard.
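Alternatively, if ssh-copy-id is available on the local machine, the copy-and-append steps can be done in a single command from the local side (same key and host as above):
$ ssh-copy-id -i ~/.ssh/id_rsa.pub firstname.lastname@example.org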
Log on to the remote machine without entering the password. If necessary, give the path to your private key (usually not needed!):
$ ssh -i ~/.ssh/id_rsa email@example.com
Read more about SSH keys.
Restart/Shutdown the computer
Restart a local computer immediately:
$ sudo shutdown -r now
Restart a remote computer at a specific time:
$ ssh -l root computer shutdown -r hh:mm
Shut down a remote computer in 30 minutes:
$ ssh -l root computer shutdown -h +30
Job control
A job (or process) is defined as an "instance" of an executing program.
List the jobs in your job table:
$ jobs
Kill job number 2 on the list:
$ kill %2
Display information about your running processes:
$ ps
Some ps options:
-f Displays full information.
-e Displays all processes running.
-u username Displays user processes, including those from other sessions.
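For example, to display full information about all running processes, or only those of a given user (username is a placeholder):
$ ps -ef
$ ps -fu username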
Kill process by PID number:
$ kill 3682
Note: if you try to log out and get the message
There are stopped jobs
list the jobs (using jobs) and kill the stopped jobs or bring them to the foreground (see below), or simply type logout twice.
Run a job in the background with nohup:
$ nohup commands &
nohup enables the command to keep running after the user has logged out. The output that would normally go to the terminal goes to a file called nohup.out, unless it has already been redirected elsewhere (see below).
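To keep the output (including errors) out of nohup.out, redirect it yourself; the file name here is just an illustration:
$ nohup commands > output.log 2>&1 &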
Bring a background (or stopped) job to the foreground:
$ fg %jobnumber
Place a foreground job in the background (to free the terminal):
$ ^Z # type Control-z to suspend the job
$ bg
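A typical sequence might look like this, with sleep standing in for any long-running command:
$ sleep 600 # running in the foreground
$ ^Z # suspend it
$ bg # resume it in the background
$ jobs # now listed as Running
$ fg %1 # bring it back to the foreground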
Monitor (in "real time") the main processes running and information about system/hardware usage (e.g., CPU, memory, network, etc.):
$ top
Read more about job control.
File transfer via SSH
Copy file from remote host to local host:
$ scp firstname.lastname@example.org:/remote/file.ext /local/directory
Copy file from local host to remote host:
$ scp file.ext email@example.com:/remote/directory
Copy file from remote host1 to remote host2:
$ scp firstname.lastname@example.org:/remote/file.ext email@example.com:/remote/directory
Copy multiple files from local host to remote host:
$ scp file1.ext file2.ext file3.ext firstname.lastname@example.org:/remote/directory
Some scp options:
-r Recursively go through directories.
-C Compress the data before it goes over the network.
-r does not know about symbolic links and will blindly follow them.
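For example, to copy a whole local directory to the remote host with compression (the paths here are placeholders):
$ scp -rC /local/directory firstname.lastname@example.org:/remote/directory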
Some rsync options:
-a Archive mode; preserves file permissions and does not follow symlinks.
-z Enable compression; compress each file as it gets sent through the pipe.
-e ssh Uses SSH as the transport.
-v Verbose; lists files being copied.
Note: the use of trailing slashes can be confusing (see the example below).
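A typical invocation might look like this (the paths are placeholders; the trailing slash on the source means "copy the contents of the directory" rather than the directory itself):
$ rsync -avz -e ssh /local/directory/ firstname.lastname@example.org:/remote/directory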
File transfer via FTP
Connect to the remote host:
$ ftp remote.host.edu
For a public FTP server use:
user: anonymous
password: email@example.com
Change directory and see its content:
$ cd dirname
$ ls
Copy file from remote host to local host (initial local dir):
$ get remotefile.ext
Copy file from local host (initial local dir) to remote host:
$ put localfile.ext
Copy multiple files (to/from initial local dir):
$ mget *.ext
$ mput *.ext
To avoid answering (Y/N) for every file, type:
$ prompt
Close the connection:
$ bye
Without opening an FTP session, list the content of a remote directory:
$ wget ftp://ftp.host.name.edu/pub/directory/ # this will generate an index.html file
$ open index.html # whole directory structure listing
$ curl ftp://ftp.host.name.edu/pub/directory/
Keep the trailing slashes on directory names.
Download files from a remote directory:
$ wget ftp://ftp.host.name.edu/pub/directory/*.ext
$ curl ftp://ftp.host.name.edu/pub/directory/file.ext -O # no globbing
Some wget options:
--ftp-user=user Specifies the username.
--ftp-password=password Specifies the password.
-r Recursive download.
-m Keep a mirror of a directory (-r -N -l).
-c Resume getting a partially-downloaded file.
-A Comma-separated list of accepted extensions.
-R Comma-separated list of rejected extensions.
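For example, to mirror a directory on an FTP server that requires a login (the credentials and URL are placeholders):
$ wget -m --ftp-user=user --ftp-password=password ftp://ftp.host.name.edu/pub/directory/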
Some curl options:
--user username:password Specify user and password.
Note: curl is non-recursive!
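For example, to download a single file from a password-protected FTP server (credentials and path are placeholders):
$ curl --user username:password -O ftp://ftp.host.name.edu/pub/directory/file.ext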
File transfer via HTTP
Download a whole website keeping the original structure (mirror):
$ wget -m http://www.website.com/
Download recursively all .html files from a website directory:
$ wget -r -A.html http://www.website.com/directory/