Back up individual MySQL databases and upload each archive to Amazon S3. The AWS CLI must already be installed and configured (via aws configure) on the server before running this script.
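To run the backup on a schedule, the script is typically registered with cron. A minimal sketch of a crontab entry, assuming the script has been saved to /opt/scripts/mysql-s3-backup.sh (that path and the log location are assumptions, not part of this gist):

```shell
# Hypothetical crontab entry: run the backup every day at 02:00 server time,
# appending both stdout and stderr to a log file (adjust paths to your setup)
0 2 * * * /opt/scripts/mysql-s3-backup.sh >> /var/log/mysql-s3-backup.log 2>&1
```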
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# Temporary Path where the archive will be created and stored before uploading to S3
# Example: BACKUPPATH=/mnt
BACKUPPATH=/
# Amazon S3 Bucket name
# Example: S3BUCKET=production-database-backups
S3BUCKET=
# Individual database names, separated by spaces.
# Escape any underscore (_) with a backslash (\) in a database name.
# Example: DATABASES=(db1 db2 db3 db\_4)
DATABASES=()
# Individual Amazon S3 directories in the bucket, separated by spaces.
# Escape any underscore (_) with a backslash (\) in a directory name.
# Example: DIRECTORIES=(db1 db2 db3 db\_4)
DIRECTORIES=()
# MySQL Hostname or IP Address
MYSQL_HOSTNAME=127.0.0.1
# MySQL Username that has access to all the above databases
MYSQL_USERNAME=root
# MySQL Password for the above user
MYSQL_PASSWORD=
########################### END CONFIGURATION. EDIT BELOW AT YOUR OWN RISK ###########################
INDEX=0
for database in "${DATABASES[@]}"
do
# Note: the timestamp resolves to the hour (12-hour clock), so two runs within
# the same hour produce the same filename and overwrite each other.
FILENAME="$BACKUPPATH/${database}_$(date '+%m-%d-%Y_%I-%p').sql.gz"
echo "==================================================================="
echo "Backing up $database Database"
# Suppress mysqldump's stderr (e.g. the insecure command-line password warning)
mysqldump --extended-insert -Q --opt -h "$MYSQL_HOSTNAME" -u"$MYSQL_USERNAME" -p"$MYSQL_PASSWORD" --databases "$database" 2>/dev/null | gzip > "$FILENAME"
echo "Uploading Archive to Amazon S3"
aws s3 cp "$FILENAME" "s3://$S3BUCKET/${DIRECTORIES[$INDEX]}/" > /dev/null
echo "Cleaning Up..."
rm -f "$FILENAME"
INDEX=$((INDEX + 1))
done
echo "==================================================================="
echo " All Databases Backup Complete"
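Since the script deletes each local archive right after the upload, it can be worth sanity-checking the gzip file first. A minimal sketch using gzip's built-in integrity test (the sample data and /tmp path are assumptions for illustration, not part of the script):

```shell
# Create a tiny stand-in archive the same way the script does (sample SQL, hypothetical path)
echo "SELECT 1;" | gzip > /tmp/sample_backup.sql.gz

# gzip -t verifies the archive without extracting it; a non-zero exit means corruption
if gzip -t /tmp/sample_backup.sql.gz; then
    echo "archive OK"        # prints "archive OK" for a valid archive
else
    echo "archive corrupt"
fi
```

In the script itself, the same check could guard the rm -f so a corrupt dump is never silently discarded.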