@mikhail
Last active November 24, 2015 18:58
Bash scripts for backing up mysql to s3 and restoring specific tables from that backup.

Backup & Restore MySQL <-> AWS S3

Based on https://gist.github.com/2206527, this script loops through every database and each of its tables, dumps each table to its own gzipped file, and uploads those files to S3.
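As a sketch of the object layout this produces, each table ends up at `s3://<bucket>/<epoch>_-_<date>/<db>.<table>.sql.gz` (the bucket, database, and table names below are placeholders, not values from the script):

```shell
# Illustration only: how a backup object key is composed.
# "yourbucket", "yourdb_site" and "users" are placeholder names.
bucket="s3://yourbucket/mysql/backup"
stamp=$(date +"%s_-_%A_%d_%B_%Y")   # epoch first, so names sort chronologically
db="yourdb_site"
tb="users"
object="$bucket/$stamp/$db.$tb.sql.gz"
echo "$object"
```

Putting the epoch at the front of the directory name is what lets the restore script below find the most recent backup with a simple numeric sort.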

#!/bin/bash
# Based on https://gist.github.com/2206527
# Be pretty
echo -e " "
echo -e " . ____ . ______________________________"
echo -e " |/ \| | |"
echo -e "[| \e[1;31m♥ ♥\e[00m |] | S3 MySQL Backup Script v.0.1 |"
echo -e " |___==___| / © oodavid 2012 |"
echo -e " |______________________________|"
echo -e " "
# Basic variables
echo -n "MySQL root password: "
read -s mysqlpass
echo ''
bucket="s3://yourbucket/mysql/backup"
# Timestamp (sortable AND readable)
stamp=`date +"%s_-_%A_%d_%B_%Y"`
# List all the databases
databases=`mysql -u root -p"$mysqlpass" -e "SHOW DATABASES;" | tr -d "| " | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)"`
# Feedback
echo -e "Dumping to \e[1;32m$bucket/$stamp/\e[00m"
# Loop the databases
for db in $databases; do
  # Feedback
  echo -e "\e[1;34m$db\e[00m"
  # List all the tables in this database
  tables=`mysql "$db" -u root -p"$mysqlpass" -e "SHOW TABLES;" | tr -d "| "`
  for tb in $tables; do
    # Define our filenames
    filename="$db.$tb.sql.gz"
    tmpfile="/tmp/$filename"
    object="$bucket/$stamp/$filename"
    # Dump and zip (mysqldump takes the database then the table name;
    # --databases together with a table flag would not dump a single table)
    echo -e "  creating \e[0;35m$tmpfile\e[00m"
    mysqldump -u root -p"$mysqlpass" --force --opt "$db" "$tb" | gzip -c > "$tmpfile"
    # Upload
    echo -e "  uploading..."
    s3cmd put "$tmpfile" "$object"
    # Delete the temporary file
    rm -f "$tmpfile"
  done
done
# Jobs a goodun
echo -e "\e[1;32mJobs a goodun\e[00m"
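The `grep -v` filter that selects which databases to dump can be exercised on its own. A standalone sketch, fed a canned `SHOW DATABASES` result with made-up names:

```shell
# Feed a canned SHOW DATABASES result (the database names are made up)
# through the same filter the backup script uses.
kept=$(printf '%s\n' Database information_schema performance_schema mysql test yourdb_site otherdb \
  | grep -v "\(Database\|information_schema\|performance_schema\|mysql\|test\)")
echo "$kept"
```

Note the filter matches substrings, so a database named e.g. `yourdb_test` would also be skipped; anchor the patterns (`^mysql$`, `^test$`) if that matters for your naming scheme.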
Restore a specific table from the latest backup:

#!/bin/bash
if [ $# -eq 0 ]; then
  echo "Please specify the table you want to download"
  echo "Usage:"
  echo "  $0 <database>.<table_name>"
  echo ""
  echo "Example:"
  echo "  $0 yourdb_site.users"
  exit 1
fi
bucket='yourbucket/mysql/backup/'
# grep: directories only
# sort -n: by number (the epoch is the first part of the directory name)
# tail -n1: grab the last one (most recent)
# awk: strip the s3:// prefix, leaving just the bucket path
latest_backup=$(s3cmd ls "s3://$bucket" | grep DIR | sort -n | tail -n1 | awk -F"s3://" '{print $2}')
echo "Found latest backup bucket: $latest_backup"
echo "Searching for $1..."
all_tables=$(s3cmd --include="$1" ls "s3://$latest_backup*")
table_archive=$(echo "$all_tables" | grep "$1.sql.gz" | awk -F"s3://" '{print $2}')
if [ -z "$table_archive" ]; then
  echo "Could not find an archive for $1"
  exit 2
fi
echo "Found archive for $1: $table_archive"
tmpfile="/tmp/$1.sql"
echo "Downloading into $tmpfile.gz"
s3cmd get "s3://$table_archive" "$tmpfile.gz"
gunzip "$tmpfile.gz"
echo "Done!"
echo "Run this command to import the data:"
echo "mysql your_database -uyour_user -p < $tmpfile"
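The latest-backup selection relies on the epoch prefix making directory names sortable. A minimal sketch against fabricated `s3cmd ls`-style listing lines (the real input comes from `s3cmd ls`, and these epochs and dates are invented for illustration):

```shell
# Fabricated listing lines; the epoch prefix decides the ordering.
# LC_ALL=C keeps the tie-breaking byte comparison locale-independent.
latest=$(printf '%s\n' \
  "DIR s3://yourbucket/mysql/backup/1448300000_-_Monday_23_November_2015/" \
  "DIR s3://yourbucket/mysql/backup/1448388000_-_Tuesday_24_November_2015/" \
  "DIR s3://yourbucket/mysql/backup/1448200000_-_Sunday_22_November_2015/" \
  | grep DIR | LC_ALL=C sort -n | tail -n1 | awk -F"s3://" '{print $2}')
echo "$latest"   # the highest-epoch (Tuesday_24) directory wins
```

Because every epoch here has the same number of digits, the byte-wise tie-break of `sort -n` agrees with numeric order; epochs will stay 10 digits wide until the year 2286, so this holds for any realistic backup set.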