@mtigas
Created February 9, 2010 22:56
Script that creates a tarball backup of a directory, compressed with xz (LZMA2) and encrypted with GPG. Uploads backup to S3 storage space.
#!/bin/bash
# Script that creates a tarball backup of a directory,
# compressed with xz (LZMA2) and encrypted with GPG.
# Resulting backup is then uploaded onto my Amazon S3
# storage space.
#
# Requires:
# * GNU Tar 1.22
# * XZ Utils 4.999.9beta
# * GNU Privacy Guard
# * Python 2.5+
# * python-boto
#
# See below for the gpgxz.sh and s3up.py helper scripts.
export BACKUPDATE=$(date +"%Y%m%d-%H%M")
export BACKUP_FILE="/tmp/$1-$BACKUPDATE.tar.xz.gpg"
export BACKUP_FILE_BASENAME="$1-$BACKUPDATE.tar.xz.gpg"
echo
echo "Removing existing backup file (if applicable)"
rm -f "$BACKUP_FILE"
echo
echo
tar --exclude-caches-all --exclude="*.pyo" --exclude="*.pyc" -I gpgxz.sh -cf "$BACKUP_FILE" "$1"
echo
echo "Uploading backup to S3 store"
s3up.py "$BACKUP_FILE" miketigas-backup "$(date -u +"%Y%m%d-%H")-UTC/$BACKUP_FILE_BASENAME" 0 "private"
echo
echo "Removing archive from temp dir"
rm -f "$BACKUP_FILE"
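Restoring one of these backups reverses the pipeline: gpg decrypts, xz decompresses, tar extracts. The sketch below round-trips just the tar and xz layers so it can run anywhere; the points where the gpg layer slots in are noted in comments, since that step needs the recipient's private key. All paths here are hypothetical.

```shell
#!/bin/bash
# Round-trip the tar|xz layers of the backup pipeline.
# (gpg omitted: it requires a real keypair; see comments.)
set -e
workdir=$(mktemp -d)
mkdir "$workdir/data"
echo "hello backup" > "$workdir/data/file.txt"

# Forward: tar -> xz. The real script pipes this through
# "gpg -er <keyid>" as a final stage.
tar -C "$workdir" -cf - data | xz -zc -9 > "$workdir/backup.tar.xz"

# Reverse: xz -d -> tar -x. A real restore starts with
# "gpg --decrypt backup.tar.xz.gpg |" instead of a plain xz input.
mkdir "$workdir/restore"
xz -dc "$workdir/backup.tar.xz" | tar -C "$workdir/restore" -xf -

cat "$workdir/restore/data/file.txt"
```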
#!/bin/bash
# Wraps a data stream through xz (LZMA2) compression and then
# through GPG encryption. Intended for use in GNU tar's
# --use-compress-program option.
#
# Feel free to adjust the xz settings (currently uses high
# compression + "extra CPU use" but with memory limiting)
# depending on the system this script is deployed on.
#
# Obviously you will want to change the recipient of the
# gpg encryption. (And you'll need their public key.)
case "$1" in
  -d) gpg --decrypt - | xz -dvc ;;
  '') xz -zvc -9e --memory=150MiB | gpg -er 5413B5A3 ;;
  *)  echo "Unknown option: $1" >&2; exit 1 ;;
esac
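The xz flags in gpgxz.sh can be exercised on their own to confirm a stream survives the compress/decompress round trip. One caveat: the memory cap was spelled `--memory` in the 4.999.9 beta this gist targets, while current xz releases spell it `--memlimit-compress`, so the sketch below omits it for portability.

```shell
# Round-trip a stream through the same compression flags gpgxz.sh
# uses (memory-cap flag omitted; its name changed across xz versions).
data="some stream of data"
roundtrip=$(printf '%s' "$data" | xz -zc -9e | xz -dc)
printf '%s\n' "$roundtrip"
```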
#!/usr/bin/python
import sys
import traceback
from mimetypes import guess_type
from datetime import datetime,timedelta
from boto.s3.connection import S3Connection
from socket import setdefaulttimeout
setdefaulttimeout(100.0)
# Fill in your AWS credentials:
AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
def upload_file(local_file, bucket, remote_path, cache_time=43200, policy="public-read"):
    cache_time = int(cache_time)
    now = datetime.utcnow()  # Expires header below is formatted as GMT
    expire_dt = now + timedelta(seconds=cache_time*1.5)
    s3 = S3Connection(aws_access_key_id=AWS_ACCESS_KEY_ID,
                      aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
    bucket = s3.get_bucket(bucket)
    key = bucket.new_key(remote_path)
    # guess_type() returns a (type, encoding) tuple; we want the type
    key.content_type = guess_type(local_file, False)[0]
    if cache_time != 0:
        key.set_metadata('Cache-Control', 'max-age=%d, must-revalidate' % cache_time)
        key.set_metadata('Expires', expire_dt.strftime("%a, %d %b %Y %H:%M:%S GMT"))
    key.set_contents_from_filename(local_file, policy=policy)
    if policy == "public-read":
        key.make_public()
def main(args):
    if 3 <= len(args) <= 5:
        upload_file(*args)
    else:
        print "Usage:"
        print "s3up filename bucket remote_filename [cache_time] [policy]"
if __name__ == '__main__':
    try:
        main(sys.argv[1:])
    except Exception:
        sys.stderr.write('\n')
        traceback.print_exc(file=sys.stderr)
        sys.stderr.write('\n')
        print sys.argv[1:]
        sys.exit(1)
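For reference, the remote key the backup script hands to s3up.py is namespaced by the UTC hour of the upload. A small sketch of how that key is composed, where `mydir` is a hypothetical stand-in for the directory argument:

```shell
# Compose the S3 key the same way the backup script does.
# "mydir" is hypothetical; the real script uses its first argument.
BACKUPDATE=$(date +"%Y%m%d-%H%M")
BACKUP_FILE_BASENAME="mydir-$BACKUPDATE.tar.xz.gpg"
REMOTE_KEY="$(date -u +"%Y%m%d-%H")-UTC/$BACKUP_FILE_BASENAME"
echo "$REMOTE_KEY"
```

Keying by UTC hour means backups taken in different local time zones still sort and group consistently in the bucket listing.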