
@saniaky
Last active April 9, 2019 07:05
Automatic backup of all MySQL databases to AWS S3, plus a restore script that downloads the latest backup and imports it.
  1. Create S3 bucket db-backup
  2. Specify an expiration policy to remove old backups:
    Open Bucket Settings -> Management -> Add lifecycle rule -> Expiration tab.
    Select "Expire current version of object". Use 30 days.
    Select "Permanently delete previous versions". Use 1 day.
  3. Create user db-backup-user, assign inline permission policy:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowAll",
                "Effect": "Allow",
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::db-backup",
                    "arn:aws:s3:::db-backup/*"
                ]
            }
        ]
    }
    
  4. On the server, configure the AWS CLI with the generated access key (run `aws configure`).
  5. Copy the backup.sh and restore.sh scripts to the server.
  6. Add backup.sh as a cron job (see the comment at the end of backup.sh).
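The expiration rule from step 2 can also be applied without the console, via `aws s3api put-bucket-lifecycle-configuration --bucket db-backup --lifecycle-configuration file://lifecycle.json`. A sketch of the equivalent lifecycle.json (the rule ID is an arbitrary made-up name; day counts match the steps above):

```json
{
    "Rules": [
        {
            "ID": "expire-old-db-backups",
            "Status": "Enabled",
            "Filter": { "Prefix": "" },
            "Expiration": { "Days": 30 },
            "NoncurrentVersionExpiration": { "NoncurrentDays": 1 }
        }
    ]
}
```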
#!/usr/bin/env bash
# backup.sh - dump all MySQL databases from the Docker container and upload to S3
set -euo pipefail

BUCKET=db-backup
CONTAINER=db
DATE=$(date '+%Y-%m-%d_%H.%M.%S')
FILE=./db_backups/db-${DATE}.sql.gz

# Create backup
mkdir -p ./db_backups
echo "Making backup to: $FILE"
docker exec "$CONTAINER" sh -c \
  'mysqldump --all-databases --quick --single-transaction --skip-lock-tables --flush-privileges -uroot -p"$MYSQL_ROOT_PASSWORD"' \
  | gzip > "$FILE"

# Upload to AWS S3
echo "Uploading $FILE to S3 bucket $BUCKET"
aws s3 cp "$FILE" "s3://$BUCKET"

# Add cron job:
#   bash> crontab -e
#   0 0 * * * /bin/bash /path/to/backup.sh
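Since the dump is piped straight through gzip, a cheap sanity check after each run is `gzip -t`, which verifies the archive's integrity without extracting it. A sketch with a stand-in file (the path and SQL contents are invented for illustration):

```shell
# Produce a small gzipped "dump" the same way backup.sh does.
echo "CREATE DATABASE demo;" | gzip > /tmp/db-demo.sql.gz

# Verify the archive without extracting it; non-zero exit means a corrupt file.
gzip -t /tmp/db-demo.sql.gz && echo "backup archive OK"
```

The same check can be added to backup.sh right before the `aws s3 cp` upload, so a truncated dump never overwrites a good backup in S3.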
#!/usr/bin/env bash
# restore.sh - download a backup from S3 (the latest one by default) and import it
set -euo pipefail

BUCKET=db-backup
CONTAINER=db
FOLDER=./db_backups
KEY=${1:-}

# Get latest backup key from S3 if none was given as an argument
if [[ -z "$KEY" ]]; then
  echo "Getting latest backup version..."
  KEY=$(aws s3 ls "s3://$BUCKET" --recursive | sort | tail -n 1 | awk '{print $4}')
fi

# Download backup
echo "Downloading backup - $KEY"
mkdir -p "$FOLDER"
aws s3 cp "s3://$BUCKET/$KEY" "$FOLDER/$KEY"

# Unzip file
echo "Unzipping file $KEY"
gunzip -f "$FOLDER/$KEY"
FILE="$FOLDER/${KEY%.gz}"

# Import databases
echo "Importing file $FILE"
docker exec -i "$CONTAINER" sh -c 'mysql -uroot -p"$MYSQL_ROOT_PASSWORD"' < "$FILE"

echo "Removing $FILE"
rm -f "$FILE"
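The "latest key" lookup in restore.sh works because the backup filenames embed a zero-padded, year-first timestamp, so a plain string `sort` orders them chronologically. It can be checked against canned `aws s3 ls` output, which lists date, time, size, and key per line (the sample lines below are invented):

```shell
# Simulated `aws s3 ls s3://db-backup --recursive` output: date, time, size, key.
LISTING='2019-04-08 00:00:02    1024 db-2019-04-08_00.00.00.sql.gz
2019-04-09 00:00:02    2048 db-2019-04-09_00.00.00.sql.gz
2019-04-07 00:00:02     512 db-2019-04-07_00.00.00.sql.gz'

# Same pipeline as restore.sh: the newest line sorts last, awk keeps column 4 (the key).
KEY=$(echo "$LISTING" | sort | tail -n 1 | awk '{print $4}')
echo "$KEY"   # db-2019-04-09_00.00.00.sql.gz
```

Note this sorts on the listing timestamp (upload time), not the filename, but with the naming scheme above the two orders agree.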