
#!/bin/bash
# A small script to back up files and directories to S3 using s3cmd.
# you can drop this in cron.hourly or cron.daily, or run it manually.
## relies on an installed and properly configured s3cmd.
## install python-magic to keep s3cmd from throwing warnings.
# set your bucket name here. it can be global for your org, as we store all data
# under the hostname.
# note: bucket names are global, so you need to pick something nobody else already picked.
# protip: bucketname is already taken by somebody. :)
bucket='bucketname'
# data for this host is stored under its hostname as a prefix.
hostname=$(hostname)
# add files to backup here, no trailing slash.
declare -a files=()
# example:
# declare -a files=('/some/file/path' '/another/file/path' '/yet/one/more/file/path' \
# '/use/back/slashes/to/wrap/to/a/new/line' '/leave/a/space/between/entries' \
# '/single/or/double/quotes/around/entries')
# add directories to backup here, no trailing slash.
declare -a dirs=('/etc')
# verify that our bucket exists; if not, make it.
if [ "$(s3cmd ls | grep -c "s3://${bucket}\$")" -eq 0 ]; then
    s3cmd mb "s3://${bucket}" >/dev/null
fi

# sync each directory into the bucket, prefixed with this hostname.
for directory in "${dirs[@]}"; do
    s3cmd sync "${directory}/" "s3://${bucket}/${hostname}${directory}/" >/dev/null
done

# upload each individual file. note: s3cmd put uploads local files; cp copies s3-to-s3.
for file in "${files[@]}"; do
    s3cmd put "${file}" "s3://${bucket}/${hostname}${file}" >/dev/null
done
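As a quick sanity check of the key layout the script produces (the bucket name, hostname, and paths below are hypothetical examples, not values from the script), the sync and put commands compose their destination keys like this:

```shell
#!/bin/sh
# Hypothetical values, for illustration only.
bucket='bucketname'
hostname='web01'
directory='/etc'
file='/root/.bashrc'

# The same string compositions the backup script uses:
echo "s3://${bucket}/${hostname}${directory}/"   # s3://bucketname/web01/etc/
echo "s3://${bucket}/${hostname}${file}"         # s3://bucketname/web01/root/.bashrc
```

Because the paths in the arrays start with `/` and have no trailing slash, they concatenate cleanly onto the hostname prefix without producing double slashes in the keys.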