@cam8001
Last active February 17, 2017 16:24
#!/bin/bash
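# Back up every MySQL database (full structure, plus data for everything except
# Drupal's cache/log tables) and every site directory under $SITES_DIR to S3,
# gzipped and stamped with today's date.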
TMP="/tmp"
DB_USER="root"
DB_PASSWD="password"
SITES_DIR="/var/www"
S3_BUCKET="s3://bucket-name"
DATE=$(date +%Y-%m-%d)
# Backup all databases to S3
for DB in $(mysql --user="$DB_USER" --password="$DB_PASSWD" -e 'show databases' -s --skip-column-names | grep -Ev "^(information_schema|performance_schema|mysql)$")
do
# First dump the structures
TABLES=$(mysql --skip-column-names -e 'show tables' --user="${DB_USER}" --password="${DB_PASSWD}" "${DB}")
# NB: $TABLES and $TABLES2 are intentionally left unquoted below so they expand to one table name per argument.
mysqldump --complete-insert --disable-keys --single-transaction --no-data --user="$DB_USER" --password="$DB_PASSWD" --opt "$DB" $TABLES > "$TMP/$DB-$DATE"
# Then dump the data, skipping Drupal's cache, search, session, log and other temporary tables.
TABLES2=$(echo "$TABLES" | grep -Ev "^(accesslog|cache_.*|flood|search_.*|semaphore|sessions|watchdog)$")
mysqldump --complete-insert --disable-keys --single-transaction --no-create-info --user="$DB_USER" --password="$DB_PASSWD" "$DB" $TABLES2 >> "$TMP/$DB-$DATE"
# Gzip everything
gzip -v "$TMP/$DB-$DATE"
# Upload to Amazon S3
s3cmd put "$TMP/$DB-$DATE.gz" "$S3_BUCKET/databases/$DB-$DATE.gz"
# Cleanup
rm "$TMP/$DB-$DATE.gz"
done
# Backup all sites to S3
cd "$SITES_DIR" || exit 1
for DIR in $(find "$SITES_DIR" -mindepth 1 -maxdepth 1 -type d)
do
# Tar and gzip each directory
BASE=$(basename "$DIR")
tar -czf "$TMP/$BASE.tar.gz" "$BASE"
# Upload to Amazon S3
s3cmd put "$TMP/$BASE.tar.gz" "$S3_BUCKET/sites/$BASE-$DATE.tar.gz"
# Cleanup
rm "$TMP/$BASE.tar.gz"
done
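
To run this unattended, here is a minimal sketch of a nightly crontab entry. It assumes the script is saved as /usr/local/bin/backup-to-s3.sh (a placeholder path), is executable, and that s3cmd has already been set up with s3cmd --configure:

# Nightly backup at 02:30 (add via crontab -e); script and log paths are placeholders
30 2 * * * /usr/local/bin/backup-to-s3.sh >> /var/log/backup-to-s3.log 2>&1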
@iamkingsleyf

Can this work for WP? My SQL backups and site folders are in the same directory.

@tarto-dev

@iamkingsleyf It would; you could just remove L16 (the grep that filters out Drupal's cache_*, watchdog, etc. tables), since that's only useful on Drupal and useless with WordPress or Joomla :)
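
For reference, a minimal sketch of what the database loop looks like without the Drupal table filter, i.e. one full dump per database. It reuses the TMP, DB_USER, DB_PASSWD, S3_BUCKET and DATE variables from the script above and is otherwise just a simplification of it:

# WordPress/Joomla variant: dump every table in one pass, no filtering
for DB in $(mysql --user="$DB_USER" --password="$DB_PASSWD" -e 'show databases' -s --skip-column-names | grep -Ev "^(information_schema|performance_schema|mysql)$")
do
  mysqldump --single-transaction --user="$DB_USER" --password="$DB_PASSWD" "$DB" > "$TMP/$DB-$DATE"
  gzip -v "$TMP/$DB-$DATE"
  s3cmd put "$TMP/$DB-$DATE.gz" "$S3_BUCKET/databases/$DB-$DATE.gz"
  rm "$TMP/$DB-$DATE.gz"
done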
