Backup Folders and MySQL to Amazon S3

This is a simple way to back up your MySQL databases and selected folders to Amazon S3 on a nightly schedule - this is all to be done on your server.

Be careful - spend some time testing this out on your setup first. I’m really not to blame if you delete all the things.

Sister Document - Restore MySQL from Amazon S3 - read that next

1 - Install s3cmd

This is for Ubuntu; see http://s3tools.org/repositories for instructions for other systems.

  • Import S3tools signing key
    • wget -O- -q http://s3tools.org/repo/deb-all/stable/s3tools.key | sudo apt-key add -
  • Add the repo to sources.list
    • sudo wget -O/etc/apt/sources.list.d/s3tools.list http://s3tools.org/repo/deb-all/stable/s3tools.list
  • Refresh package cache and install the newest s3cmd
    • sudo apt-get update && sudo apt-get install s3cmd
  • Set up s3cmd with s3cmd --configure
    • You’ll need to enter your AWS access key and secret key here; everything else is optional and can be ignored. A quick sanity check follows this list.
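
Once configured, a quick way to confirm the credentials work is to list your bucket (BUCKETNAME is a placeholder for your own bucket name):

# List the bucket - if this returns without errors, s3cmd is good to go
s3cmd ls s3://BUCKETNAME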

2 - Add your script

Upload a copy of s3backup.sh (it will need some tweaks for your setup - the variables are explained below), make it executable, and test it:

# Add the executable bit
chmod +x s3backup.sh
# Run the script to make sure it's all tickety boo
./s3backup.sh
  • bucket: the name of your S3 bucket.
  • path: the folder path inside the bucket to upload to, e.g. backups.
  • mysqluser: the name of your MySQL user.
  • mysqlpass: that MySQL user’s password.
  • folderBackup: the folders to back up; each key becomes part of the filename placed into S3, e.g.
folderBackup[mysite-uploads]="/var/www/mysite/uploads"
folderBackup[myothersite-uploads]="/var/www/myothersite/uploads"
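
Put together, a filled-in configuration block at the top of the script might look like this (the bucket name, credentials, and paths below are made up - substitute your own):

bucket="s3://my-backup-bucket"   # your S3 bucket (hypothetical name)
path="backups"                   # folder inside the bucket
mysqluser="backupuser"           # MySQL user (hypothetical)
mysqlpass="s3cr3t"               # that user's password (hypothetical)
declare -A folderBackup
folderBackup[mysite-uploads]="/var/www/mysite/uploads"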

3 - Run it every night with CRON

Assuming the backup script is stored at /var/www/s3backup.sh (update the path if you put it somewhere else), we need to add a cron task to run it automatically:

# Edit the crontab
env EDITOR=nano crontab -e
    # Add the following lines:
    # Run the backup script at 3am
    0 3 * * * bash /var/www/s3backup.sh >/dev/null 2>&1
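
If you’d rather keep a record of each run than throw the output away, you could point cron at a log file instead of /dev/null (the log path here is just an example):

# Run the backup script at 3am, appending output to a log
0 3 * * * bash /var/www/s3backup.sh >> /var/log/s3backup.log 2>&1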

If you’re using Laravel Forge (or similar) you can set a scheduled job with: Command: (cd /home/forge/ && ./s3backup.sh), where /home/forge/ is the location of the script.

4 - Don't expose the script!

Do not put this in a publicly accessible folder - the script contains your MySQL credentials. It should live outside the web root!
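
As an extra precaution you can lock the permissions down so only the owning user can read or run it - a minimal sketch:

# Only the owner can read, write, or execute the script (it holds your MySQL password)
chmod 700 /var/www/s3backup.sh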

5 - Set up object expiration in your S3 bucket

Otherwise the space you use will get bigger and bigger, and your bill will grow to match.

http://docs.aws.amazon.com/AmazonS3/latest/user-guide/create-lifecycle.html
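
If your s3cmd version supports it, you can also set an expiry rule from the command line rather than the console - the 30-day window and prefix here are just examples:

# Expire objects under backups/ after 30 days
s3cmd expire s3://BUCKETNAME --expiry-days=30 --expiry-prefix=backups/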

s3backup.sh

#!/bin/bash
# Based on https://gist.github.com/2206527
# S3 details
bucket="s3://BUCKETNAME"
path="backups"
# Database variables
mysqluser="root"
mysqlpass="PASSWORD"
# Folder backup variables
declare -A folderBackup
#folderBackup[mysite-uploads]="/var/www/mysite/public/uploads"
################################################################################
# Timestamp
stamp=$(date +"%s--%d-%m-%Y--%H%M")
# List all the databases
databases=$(mysql -u "$mysqluser" -p"$mysqlpass" -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)")
# Feedback
echo -e "Dumping to \e[1;32m$bucket/$stamp/\e[00m"
# Loop the databases ###########################################################
for db in $databases; do
    # Define our filenames
    filename="$stamp-$db.sql.gz"
    tmpfile="/tmp/$filename"
    object="$bucket/$path/$filename"
    # Feedback
    echo -e "\e[1;34m$db\e[00m"
    # Dump and zip
    echo -e "\e[0;35mCreating $tmpfile\e[00m"
    mysqldump -u "$mysqluser" -p"$mysqlpass" --force --opt --databases "$db" | gzip -c > "$tmpfile"
    # Upload
    echo -e "\e[0;35m✩ Uploading $db\e[00m"
    s3cmd put "$tmpfile" "$object"
    # Delete the local temp file
    rm -f "$tmpfile"
done
# Loop the backup folders ######################################################
for i in "${!folderBackup[@]}"; do
    # Define our filenames
    filename="$stamp-$i.tar.gz"
    tmpfile="/tmp/$filename"
    object="$bucket/$path/$filename"
    # Tar and zip (-f - writes the archive to stdout explicitly)
    echo -e "\e[0;35mCreating $tmpfile\e[00m"
    tar -cvf - "${folderBackup[$i]}" | gzip > "$tmpfile"
    # Upload
    echo -e "\e[0;35mUploading $i\e[00m"
    s3cmd put "$tmpfile" "$object"
    # Delete the local temp file
    rm -f "$tmpfile"
done
echo -e "\e[1;32m★ Job’s a good ’un\e[00m"