@mattboldt · Last active May 5, 2018 17:06
Berkshelf packaging your cookbooks and uploading them to S3
# Gemfile
# frozen_string_literal: true

source "https://rubygems.org"

gem "aws-sdk", "~> 2"
gem "colorize"
#!/usr/bin/env ruby
# Packages your Berkshelf cookbooks and uploads the archive to S3,
# keeping the previous remote archive under a timestamped key.

require 'aws-sdk'
require 'colorize'

AWS_ACCESS_KEY        = '' # your AWS access key ID
AWS_SECRET_ACCESS_KEY = '' # your AWS secret access key
COOKBOOK_BUCKET       = '' # name of the S3 bucket that holds the archive
COOKBOOK_ARCHIVE      = 'cookbooks.tar.gz'

Aws.config.update(
  region: '<your AWS region>',
  credentials: Aws::Credentials.new(AWS_ACCESS_KEY, AWS_SECRET_ACCESS_KEY)
)

s3 = Aws::S3::Resource.new
bucket = s3.bucket(COOKBOOK_BUCKET)
obj = bucket.object(COOKBOOK_ARCHIVE)

if obj.exists?
  # Rename the existing remote archive, prefixing its last-modified timestamp
  new_key = "#{obj.last_modified} - #{obj.key}"
  puts "Renaming remote #{COOKBOOK_ARCHIVE} to #{new_key}".green
  obj.move_to(bucket: COOKBOOK_BUCKET, key: new_key)
else
  puts "No existing #{COOKBOOK_ARCHIVE} found".red
end

# Generate a new Berkshelf package archive
package_output = `berks package #{COOKBOOK_ARCHIVE}`
puts "Packaged #{package_output.strip}".green

# Upload the new package under the cookbook archive name
puts "Uploading #{COOKBOOK_ARCHIVE} to S3 bucket #{COOKBOOK_BUCKET}".green
File.open(COOKBOOK_ARCHIVE, 'rb') do |file|
  bucket.put_object(body: file, key: COOKBOOK_ARCHIVE)
end
puts "Success!".blue.underline

# Remove the local copy
File.delete(COOKBOOK_ARCHIVE)
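The rename step produces keys of the form `<last_modified> - <original key>`. A minimal standalone sketch of that naming, using an illustrative fixed timestamp in place of the object's real `last_modified` value:

```ruby
require 'time'

# Illustrative only: mimic the script's rename step, which prefixes the
# previous archive's S3 last_modified timestamp to its key.
last_modified = Time.utc(2018, 5, 5, 17, 6, 0) # stand-in for obj.last_modified
key = 'cookbooks.tar.gz'

new_key = "#{last_modified} - #{key}"
puts new_key # → 2018-05-05 17:06:00 UTC - cookbooks.tar.gz
```

To run the actual script, fill in the credentials, region, and bucket name, install the gems from the Gemfile (e.g. with `bundle install`), and execute it with Ruby from the directory containing your Berksfile.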