
@matthbull
Created January 6, 2016 10:17
Tar and sync compressed files to an Amazon Web Services S3 bucket.
#!/bin/sh
# Author: Matt Holbrook-Bull
# Description: Tar and upload files to an AWS S3 bucket. Also backs up Apache sites-available config files (remove that step if you don't need it).
# Setup: you need to install the AWS CLI - http://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-set-up.html
# To run this automatically, set up a root crontab job.
# Removes backups more than 5 days old. Edit the -mtime value in the find commands below to adjust the retention period.
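# Example root crontab entry (hypothetical path and schedule -- adjust to suit;
# add it with `crontab -e` as root). This runs the script daily at 02:30 and
# appends its output to a log file:
#   30 2 * * * /root/scripts/s3backup.sh >> /var/log/s3backup.log 2>&1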
# Alter these settings to specify your S3 bucket, the local backup path on your server, and the source path of the folders to back up.
DATE=$(date +%d%m%y%H%M)
S3PATH=XXXXXX
BKPATH=XXXXX
SOURCEPATH=/var/www/vhosts
if [ -n "$1" ]; then
    SITENAME=$1 # optional site-name argument (not used below; extend the loop to filter on it)
fi
echo "${DATE}"
echo "tar'ing folders..."
for path in "${SOURCEPATH}"/*; do
    [ -d "${path}" ] || continue # skip anything that is not a directory
    # strip the leading / so tar (run with -C /) stores relative paths without warnings
    tar -zcf "${BKPATH}/sitebackup_$(basename "${path}")_${DATE}.tar.gz" -C / "${path#/}"
done
tar -zcf "${BKPATH}/sites-available_${DATE}.tar.gz" -C / etc/apache2/sites-available
echo "removing old backups..."
# remove old backups
find "${BKPATH}" -maxdepth 1 -name 'site*' -mtime +5 -exec rm {} \;
find "${BKPATH}/confs" -type f -mtime +5 -exec rm {} \;
echo "syncing to amazon..."
# sync to amazon
/usr/local/bin/aws s3 sync "${BKPATH}" "s3://${S3PATH}" --delete
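# To restore a site, fetch the archive and unpack it from / (hypothetical
# archive name -- substitute a real key listed by `aws s3 ls s3://<your-bucket>/`):
#   aws s3 cp s3://<your-bucket>/sitebackup_example_0601161017.tar.gz /tmp/
#   tar -xzf /tmp/sitebackup_example_0601161017.tar.gz -C /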