
@ssimpson89
Last active May 17, 2019 13:17
serverbackup.sh
#!/bin/bash
######
# This script tars up the files you define in the rsync section and sends the archive off to S3.
# It also deletes any backups older than $DAYS days from the bucket.
# You need s3cmd (usually available in your distro's repos) for this to work.
# s3cmd also works with DreamObjects, but for B2 or Rackspace you'll need to swap out that code.
# Output is logged to /var/log/backup.log and errors to /var/log/backup.err.
######
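# One-time setup (a sketch; the package name and install command may differ by distro):
#   apt-get install s3cmd    # or: yum install s3cmd / pip install s3cmd
#   s3cmd --configure        # prompts for access key, secret key, and endpoint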
# Note: %M is minutes; the original %m repeated the month, so two backups taken
# on the same day at different times could collide on the same filename.
DATE=$(date +%Y-%m-%d_%H-%M)
TMPBACKUP='/backup'
BACKUPFOLDER="${TMPBACKUP}/backup-${DATE}"
TARFILE="backup-${DATE}.tar.gz"
BUCKET='randomstuff'
FOLDER='Backups'
DAYS=5
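# With the values above, this produces names like (illustrative timestamp):
#   /backup/backup-2019-05-17_13-17/                          staging folder
#   s3://randomstuff/Backups/backup-2019-05-17_13-17.tar.gz   uploaded archive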
# Wrap it all in a function to make logging easier
main() {
    # ------------- BEGIN ------------ #
    # Prep work: create the staging folder
    mkdir -p "${BACKUPFOLDER}"
    # Sync all the files over first (-a = archive mode, -R = keep full relative paths)
    rsync -aR /usr/local/nginx/ "${BACKUPFOLDER}"
    rsync -aR /usr/local/php/ "${BACKUPFOLDER}"
    rsync -aR --exclude=root/tmp/ /root "${BACKUPFOLDER}"
    # Dump every MySQL database into the staging folder
    mysqldump --all-databases > "${BACKUPFOLDER}/alldbs.sql"
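    # Assumption: mysqldump can authenticate non-interactively (e.g. credentials
    # in /root/.my.cnf); otherwise add -u/-p options to the line above.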
    # Then tar it up! (-C makes paths in the archive relative to ${TMPBACKUP},
    # which also avoids tar's "Removing leading '/'" warning)
    tar -zcf "${TMPBACKUP}/${TARFILE}" -C "${TMPBACKUP}" "backup-${DATE}"
    # Upload it to S3/DreamObjects
    s3cmd --no-progress put "${TMPBACKUP}/${TARFILE}" "s3://${BUCKET}/${FOLDER}/"
    # Loop over the bucket and delete backups older than $DAYS days.
    # The sed pulls the YYYY-MM-DD stamp out of ".../backup-YYYY-MM-DD_HH-MM.tar.gz".
    # Dates are compared as %Y%m%d so the numeric comparison orders correctly
    # across month and year boundaries.
    for backup in $(s3cmd ls "s3://${BUCKET}/${FOLDER}/" | awk '{ print $4 }' | grep tar)
    do
        if [ $(date -d "$(echo ${backup} | sed -e 's/\(.*backup-\)\(.*\)\(_.*\)/\2/g')" +%Y%m%d) -lt $(date -d "${DAYS} days ago" +%Y%m%d) ]
        then
            echo "Deleting ${backup}"
            s3cmd del "${backup}"
        fi
    done
    # Cleanup!
    rm -f "${TMPBACKUP}/${TARFILE}"
    rm -rf "${BACKUPFOLDER}"
    # ------------- END ------------ #
}
main 1>/var/log/backup.log 2>/var/log/backup.err
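# To run this nightly, one might add a cron entry like the following
# (illustrative schedule and path; adjust for your system):
#   0 2 * * * /root/serverbackup.sh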
@OliverBailey

Out of curiosity, why not set a retention policy through S3 itself to remove files after X days?

https://aws.amazon.com/blogs/aws/amazon-s3-object-expiration/
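For reference, s3cmd can set such a lifecycle rule itself. A minimal sketch, assuming an s3cmd version that supports the expire command, and using the bucket and prefix from the script above:

s3cmd expire s3://randomstuff --expiry-days=5 --expiry-prefix=Backups/

With a rule like that in place, the deletion loop in the script becomes unnecessary.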
