@ruckus (created January 9, 2015 03:58)
Ruby script to download log files from RDS and archive them to S3.
#!/usr/bin/ruby
=begin
This script uses the AWS SDK for Ruby v2 API.

Gemfile:

  source "https://rubygems.org"
  gem "aws-sdk", "2.0.17.pre"
=end
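# A minimal way to install the pinned SDK and run the script with Bundler,
# assuming the Gemfile above sits next to this file and the script is saved
# under a name of your choosing (rds_log_archive.rb is only an example):
#
#   bundle install
#   bundle exec ruby rds_log_archive.rb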
require 'rubygems'
require 'aws-sdk'
require 'zlib'
# IAM credentials with access to RDS (access key ID / secret access key)
rds_key = ''
rds_secret = ''
rds_credentials = Aws::Credentials.new(rds_key, rds_secret)

# IAM credentials with access to S3
s3_key = ''
s3_secret = ''
s3_credentials = Aws::Credentials.new(s3_key, s3_secret)
db = 'db-instance-identifier'
bucket = 's3-destination-bucket'

# S3 key prefix (one folder per day) and the hour of the log file to fetch:
# the previous hour, whose hour-stamped file RDS has finished writing.
date_prefix = Time.now.strftime('%Y-%m-%d')
timestamp = (Time.now.utc - 3600).strftime('%Y-%m-%d-%H')
rds = Aws::RDS::Client.new(region: 'us-west-1', credentials: rds_credentials)
s3 = Aws::S3::Client.new(region: 'us-east-1', credentials: s3_credentials)
rds_log_file = "error/postgresql.log.#{timestamp}"
out_log_file = "postgresql.log.#{timestamp}"
puts "downloading: #{rds_log_file}"
puts "destination: #{out_log_file}"
opts = {
  db_instance_identifier: db,
  log_file_name: rds_log_file,
  number_of_lines: 60000,
  marker: "0"
}
# Download the log in portions, following the pagination marker until RDS
# reports that no additional data is pending.
additional_data_pending = true
File.open(out_log_file, "wb+") do |file|
  while additional_data_pending do
    out = rds.download_db_log_file_portion(opts)
    file.write(out[:log_file_data])
    opts[:marker] = out[:marker]
    additional_data_pending = out[:additional_data_pending]
  end
end
if File.exist?(out_log_file)
  # Compress the downloaded log before uploading it.
  compressed_log_file = "#{out_log_file}.gz"
  puts "uploading to s3: #{compressed_log_file}"
  Zlib::GzipWriter.open(compressed_log_file) do |gz|
    File.open(out_log_file) do |fp|
      while chunk = fp.read(16 * 1024) do
        gz.write(chunk)
      end
    end
  end

  if File.exist?(compressed_log_file)
    fp = nil
    begin
      fp = File.open(compressed_log_file, "rb")
      upload_opts = {
        acl: "private",
        bucket: bucket,
        key: "#{date_prefix}/#{compressed_log_file}",
        body: fp
      }
      s3.put_object(upload_opts)
      puts "done"
    ensure
      fp.close if fp.is_a?(File)
    end
  end

  # Clean up the local copies once the upload has completed.
  File.unlink(out_log_file)
  File.unlink(compressed_log_file)
end
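The script assumes the hourly error/postgresql.log.YYYY-MM-DD-HH naming that RDS uses for the PostgreSQL error log. To check which log files the instance actually exposes before hardcoding a name, the v2 SDK's describe_db_log_files call (DescribeDBLogFiles) lists them. A minimal sketch, reusing the rds client and db identifier defined above; it only reads the first page of results:

# List the log files RDS currently reports for the instance. The response
# also carries a pagination marker, which this sketch does not follow.
resp = rds.describe_db_log_files(db_instance_identifier: db)
resp.describe_db_log_files.each do |f|
  puts "#{f.log_file_name} (#{f.size} bytes)"
end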