@arturoleon
Created May 29, 2013 04:33
Script to copy incremental daily cPanel backups to an S3 bucket.
require 'rubygems'
require 'aws-sdk'

backup_dir = "/backup/cpbackup/" # must end with /
users = "appspinc,arturo".split(",") # cPanel accounts to back up
bucket_name = 'BUCKET_NAME'

# S3 credentials
s3 = AWS::S3.new(
  :access_key_id => 'KEY_ID',
  :secret_access_key => 'ACCESS_KEY')
bucket = s3.buckets[bucket_name]
# Archive and upload each user's daily backup
users.each do |user|
  user.strip! # remove stray spaces around account names
  dir = "#{backup_dir}daily/#{user}"
  if File.directory?(dir) # skip accounts with no backup directory
    filename = "daily_#{user}_#{Time.now.strftime("%Y%m%d")}.tar.gz"
    puts "Starting backup for #{user}..."
    puts "Creating #{filename}"
    `tar -cvzf #{filename} #{dir}` # build the archive with the system tar
    puts "Sending file to S3..."
    # Passing the path (rather than File.open) avoids leaking an open file handle
    bucket.objects[filename].write(:file => filename, :reduced_redundancy => true)
    puts "Removing #{filename}\n\n"
    File.delete(filename)
  else
    puts "WARNING: Directory #{dir} does not exist. Skipping backup for #{user}.\n\n"
  end
end
puts "Backup finished"