@mikeadmire
Last active December 29, 2015 03:29
Dump tables from MySQL database and keep a running archive on S3.
#!/usr/bin/env ruby
require 'fileutils'
#----- Modify this block for environment -----#
# Set mysql_params to the MySQL username/password options for this server, e.g.:
# mysql_params = "-u root -psecret"
mysql_params = "-u root"
# Number of days to keep archives before they are deleted
archive_size = 10
# Set to the location on the server that will be synced to S3
local_location = "/var/lib/mysql_table_backups/"
# Set to the path on S3 that you want to sync to
s3_location = "/mysql_table_backups/"
#---------------------------------------------#
#----- You should NOT need to change anything below this -----#
# Get DB name from CLI
raise "Must pass database name as argument" if ARGV[0].nil?
database = ARGV[0]
# create directory to store tables in
timestamp = Time.now.strftime("%Y_%m_%d")
Dir.chdir(local_location)
directory_name = "#{database}_#{timestamp}"
FileUtils.mkdir_p(directory_name) # mkdir_p so a same-day rerun doesn't raise
Dir.chdir(directory_name)
# Flush all tables in the DB to disk.
# NOTE: FLUSH TABLES WITH READ LOCK releases its lock as soon as this mysql
# client exits, so it does not protect the dumps below; those rely on
# mysqldump's default --lock-tables for per-table consistency.
`mysql -e 'FLUSH TABLES WITH READ LOCK;' #{mysql_params}`
# get all tables from the specified DB
results = `mysql -e 'SHOW tables FROM #{database};' #{mysql_params}`.split("\n")
results.shift # remove the header from SHOW TABLES command
# dump each table to its own .sql file
results.each do |table|
  `mysqldump #{mysql_params} #{database} #{table} > #{table}.sql`
end
# UNLOCK TABLES runs in a fresh session, so it is effectively a no-op;
# kept for symmetry with the FLUSH above
`mysql -e 'UNLOCK TABLES;' #{mysql_params}`
# compress directory of sql dumps
Dir.chdir(local_location)
`/usr/bin/tar cvzf #{directory_name}.tgz #{directory_name}`
# Clean up dump directory
FileUtils.rm_rf(directory_name)
# Delete old files from archive: build the filename of the oldest archive
# we want to keep, then delete anything that sorts before it.
oldest_timestamp = (Time.now - (archive_size * 86400)).strftime("%Y_%m_%d")
oldest_filename = "#{database}_#{oldest_timestamp}.tgz"
Dir.chdir(local_location)
Dir.glob("#{database}_*.tgz").each do |file|
  File.delete(file) if file < oldest_filename
end
# sync to S3
`/usr/bin/s3cmd sync --config=/root/.s3cfg --skip-existing --delete-removed --no-preserve --no-progress #{local_location}* s3://#{s3_location} > /dev/null`
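A quick sanity check (standalone Ruby, not part of the script above) of the cleanup step's string comparison: filenames are compared lexicographically, which matches chronological order only because %Y_%m_%d produces zero-padded fields.

```ruby
# Zero-padded %Y_%m_%d timestamps sort the same way lexicographically
# as they do chronologically, so `file < oldest_filename` is a valid
# "older than" test on the archive filenames.
older = Time.new(2013, 11, 9).strftime("%Y_%m_%d")   # "2013_11_09"
newer = Time.new(2013, 11, 22).strftime("%Y_%m_%d")  # "2013_11_22"
puts older < newer  # string order agrees with date order
```

Without the zero padding (e.g. a format like 2013_11_9), "2013_11_9" would sort *after* "2013_11_22" and the pruning loop would delete the wrong files.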