@jrwren
Created October 21, 2013 16:17
The Debian python-boto package doesn't ship the "bins", so fetch_file and s3put are not available. Write quick-and-dirty versions of them, then fetch the latest database dump from S3 and restore it. Finally, add an hourly cron job to dump the database and put the dump to S3.
#cloud-config
runcmd:
- |
  echo "#!/usr/bin/python
  import boto, sys
  # args: bucket key filename (argv[0] is the script name, so start at argv[1])
  boto.connect_s3().get_bucket(sys.argv[1]).get_key(sys.argv[2]).get_contents_to_filename(sys.argv[3])
  " > /usr/bin/s3get
- chmod +x /usr/bin/s3get
- s3get bucket db_dumps/$HOSTNAME-dbname.pg_dump /var/tmp/dbname.pg_dump
- su -c 'pg_restore -d dbname /var/tmp/dbname.pg_dump' postgres
- |
  echo "#!/usr/bin/python
  import boto, sys
  # args: bucket filename key
  boto.connect_s3().get_bucket(sys.argv[1]).new_key(sys.argv[3]).set_contents_from_filename(sys.argv[2])
  " > /usr/bin/s3put
- chmod +x /usr/bin/s3put
- echo '0 * * * * postgres (cd /var/lib/postgresql; pg_dump -Fc dbname > $HOSTNAME-dbname.pg_dump; s3put bucket $HOSTNAME-dbname.pg_dump db_dumps/$HOSTNAME-dbname.pg_dump)' > /etc/cron.d/db_dump
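Factored out as plain functions, the two helpers above amount to one boto call each. A sketch, not the gist's code verbatim: the injectable `conn` parameter is my addition so the S3 calls can be exercised with a stub connection, and it assumes python-boto is installed with credentials available (environment variables or ~/.boto) when `conn` is omitted:

```python
def s3get(bucket, key, filename, conn=None):
    """Download s3://bucket/key to a local file (quick-and-dirty fetch_file)."""
    if conn is None:
        import boto  # deferred import: functions stay importable without boto
        conn = boto.connect_s3()  # credentials from env or ~/.boto
    conn.get_bucket(bucket).get_key(key).get_contents_to_filename(filename)


def s3put(bucket, filename, key, conn=None):
    """Upload a local file to s3://bucket/key (quick-and-dirty s3put)."""
    if conn is None:
        import boto
        conn = boto.connect_s3()
    conn.get_bucket(bucket).new_key(key).set_contents_from_filename(filename)
```

Note that the argument order of `s3put` matches the cron job above: bucket, local file, then remote key.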