@raarellano
Last active October 10, 2016 20:58
Set up automated backup scripts

Set up the folder structure

$ sudo su postgres

$ cd ~

$ mkdir -p backups/{daily,weekly,monthly,yearly} scripts
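The brace expansion above creates the four retention folders plus a scripts directory in one command; a quick sketch of the layout it produces, run against a throwaway directory instead of the postgres home:

```shell
# Demonstrate the directory layout created by the mkdir command above,
# using a temp dir in place of the postgres user's home.
BASE=$(mktemp -d)
mkdir -p "$BASE"/backups/{daily,weekly,monthly,yearly} "$BASE"/scripts
ls "$BASE"/backups     # lists: daily monthly weekly yearly
ls -d "$BASE"/scripts
```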

Install and configure the AWS CLI

Install Python and pip

Exit to a sudo user

$ exit

Check that Python is installed (to install Python, see http://docs.aws.amazon.com/cli/latest/userguide/installing.html#install-python)

$ python --version

Check that pip is installed

$ pip --help

If pip is missing, install it

$ curl -O https://bootstrap.pypa.io/get-pip.py

$ sudo python get-pip.py

Install AWS CLI

$ sudo pip install awscli

Configure the AWS CLI

$ sudo su - postgres

$ aws configure (enter your access key ID, secret access key, and default region)

$ aws s3 ls (verifies the credentials work)

Set up the S3 structure

Log in to the AWS console

Open the S3 management console

Create a bucket (the scripts below use one named app)

Create a folder named backups

Create folders production and staging inside backups
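The console steps above can also be scripted with the AWS CLI. A hedged sketch, assuming the bucket name app used by the backup scripts; the commands are echoed as a dry run so nothing is created by accident:

```shell
# Hypothetical CLI equivalents of the console steps. The bucket name
# matches the s3:// URLs in the backup scripts; adjust to taste.
# Commands are echoed (dry run); drop the leading "echo" to run them.
BUCKET=app
echo aws s3 mb "s3://$BUCKET"
# S3 "folders" are zero-byte objects whose keys end in a slash
echo aws s3api put-object --bucket "$BUCKET" --key backups/production/
echo aws s3api put-object --bucket "$BUCKET" --key backups/staging/
```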

Create backup scripts

$ cd ~/scripts

$ nano daily_backup.sh (copy and modify the corresponding script below)

$ nano weekly_backup.sh (copy and modify the corresponding script below)

$ nano monthly_backup.sh (copy and modify the corresponding script below)

$ nano yearly_backup.sh (copy and modify the corresponding script below)

Make sure the scripts are executable by the postgres user

$ chmod 770 daily_backup.sh weekly_backup.sh monthly_backup.sh yearly_backup.sh

Schedule the script using cron

$ crontab -e

(use the crontab entries below)

Test crontab

$ crontab -e

Change the daily backup to run every minute

* * * * * ~/scripts/daily_backup.sh > ~/scripts/daily_backup.log 2>&1

Check daily_backup.log for any errors, then restore the original schedule
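Besides reading the log, you can check that a dump file is not truncated: gzip -t verifies an archive's integrity and exits nonzero on corruption. A sketch using a stand-in file rather than a real dump:

```shell
# Verify a .gz dump is intact. gzip -t exits nonzero on a corrupt or
# truncated archive. A stand-in file is used here instead of a real dump.
TMP=$(mktemp -d)
echo "-- pretend SQL dump" | gzip > "$TMP/db_cluster_dump_test.gz"
if gzip -t "$TMP/db_cluster_dump_test.gz"; then
  echo "archive OK"
fi
```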

Check the S3 bucket

Log in to AWS and verify that the S3 bucket contains the uploaded files

Reference

http://www.rubytreesoftware.com/resources/basic-postgresql-backup-and-restore

crontab

# run crontab -e (as the postgres user)
# m h dom mon dow   command
0 4 * * * ~/scripts/daily_backup.sh > ~/scripts/daily_backup.log 2>&1
0 4 * * 0 ~/scripts/weekly_backup.sh > ~/scripts/weekly_backup.log 2>&1
0 4 1 * * ~/scripts/monthly_backup.sh > ~/scripts/monthly_backup.log 2>&1
0 4 1 1 * ~/scripts/yearly_backup.sh > ~/scripts/yearly_backup.log 2>&1

daily_backup.sh

#!/bin/bash
# DAILY BACKUP
# Take a full backup of the database and store it in the
# local backup folder + S3 storage
# Run this script as the postgres user
#
DATE=$(date +%Y-%m-%d)
echo "$(date) - Delete old backups"
# remove daily dumps older than 6 days
find ~/backups/daily/ -mtime +6 -delete
echo "$(date) - Do a full postgres cluster dump"
pg_dumpall | gzip > ~/backups/daily/db_cluster_dump_"$DATE".gz
echo "$(date) - Sync pg_backups with S3"
/usr/local/bin/aws s3 sync ~/backups s3://app/backups/production/clusters --delete
echo "$(date) - Sync postgres configuration files with S3"
/usr/local/bin/aws s3 sync /etc/postgresql/9.3/main s3://app/backups/production/config/main
echo "$(date) - Backup complete"

monthly_backup.sh

#!/bin/bash
# MONTHLY BACKUP
# Take a full backup of the database and store it in the
# local backup folder + S3 storage
# Run this script as the postgres user
#
DATE=$(date +%Y-%m-%d)
echo "$(date) - Delete old backups"
# remove monthly dumps older than a year
find ~/backups/monthly/ -mtime +365 -delete
echo "$(date) - Do a full postgres cluster dump"
pg_dumpall | gzip > ~/backups/monthly/db_cluster_dump_"$DATE".gz
echo "$(date) - Sync pg_backups with S3"
/usr/local/bin/aws s3 sync ~/backups s3://app/backups/production/clusters --delete
echo "$(date) - Backup complete"

weekly_backup.sh

#!/bin/bash
# WEEKLY BACKUP
# Take a full backup of the database and store it in the
# local backup folder + S3 storage
# Run this script as the postgres user
#
DATE=$(date +%Y-%m-%d)
echo "$(date) - Delete old backups"
# remove weekly dumps older than 8 weeks
find ~/backups/weekly/ -mtime +56 -delete
echo "$(date) - Do a full postgres cluster dump"
pg_dumpall | gzip > ~/backups/weekly/db_cluster_dump_"$DATE".gz
echo "$(date) - Sync pg_backups with S3"
/usr/local/bin/aws s3 sync ~/backups s3://app/backups/production/clusters --delete
echo "$(date) - Backup complete"

yearly_backup.sh

#!/bin/bash
# YEARLY BACKUP
# Take a full backup of the database and store it in the
# local backup folder + S3 storage
# Run this script as the postgres user
# (yearly dumps are never pruned, so there is no find -delete step)
#
DATE=$(date +%Y-%m-%d)
echo "$(date) - Do a full postgres cluster dump"
pg_dumpall | gzip > ~/backups/yearly/db_cluster_dump_"$DATE".gz
echo "$(date) - Sync pg_backups with S3"
/usr/local/bin/aws s3 sync ~/backups s3://app/backups/production/clusters
echo "$(date) - Backup complete"
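The retention windows in the scripts above hinge on find -mtime +N -delete. Its behavior can be sanity-checked locally with back-dated files; a sketch of the daily script's +6 window, assuming GNU coreutils for touch -d and GNU find for -delete:

```shell
# Sanity-check the "-mtime +6" retention used by daily_backup.sh:
# files older than 6 full days are deleted, newer ones survive.
DIR=$(mktemp -d)
touch -d "8 days ago" "$DIR/old_dump.gz"   # should be deleted
touch "$DIR/new_dump.gz"                   # should survive
find "$DIR" -mtime +6 -delete
ls "$DIR"    # only new_dump.gz remains
```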