A small script to back up files and directories to S3 using s3cmd
#!/bin/bash
# You can drop this in cron.hourly or cron.daily, or run it manually.
# Relies on an installed and properly configured s3cmd.
# Install python-magic to keep s3cmd from throwing warnings.
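# Setup sketch (the package names are assumptions for a Debian/Ubuntu-style
# host; python-magic can also be installed via pip):
#   apt-get install s3cmd python3-magic
#   s3cmd --configure    # interactive; writes your access keys to ~/.s3cfg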
# Set your bucket name here. It can be shared across your org, since all data
# is stored under the hostname.
# Note: bucket names are global, so you need to pick something nobody else has
# already picked.
# Protip: "bucketname" is already taken by somebody. :)
bucket="bucketname"
hostname=$(hostname)
# Add files to back up here, no trailing slash.
declare -a files=()
# Example:
# declare -a files=('/some/file/path' '/another/file/path' '/yet/one/more/file/path' \
#                   '/use/backslashes/to/wrap/to/a/new/line' '/leave/a/space/between/entries' \
#                   '/single/or/double/quotes/around/entries' )
# Add directories to back up here, no trailing slash.
declare -a dirs=('/etc')
# Verify that our bucket exists; if not, make it.
if [ "$(s3cmd ls | grep -c "s3://${bucket}$")" -eq 0 ]; then
  s3cmd mb "s3://${bucket}" >/dev/null
fi
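# Sync each directory under a per-host prefix; s3cmd sync only transfers files
# that are new or have changed since the last run.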
for directory in "${dirs[@]}"; do
  s3cmd sync "${directory}/" "s3://${bucket}/${hostname}${directory}/" >/dev/null
done
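# Upload each individual file under the same per-host prefix.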
for file in "${files[@]}"; do
  s3cmd put "${file}" "s3://${bucket}/${hostname}${file}" >/dev/null
done
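# Usage sketch (the filenames and paths below are examples, not part of this gist):
#   chmod +x s3backup.sh && ./s3backup.sh                   # run once by hand to test
#   install -m 755 s3backup.sh /etc/cron.daily/s3backup     # then let cron run it daily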