@matiaskorhonen
Last active December 15, 2015 04:39
Speed up Travis builds by caching the bundle to S3. Full explanation and instructions: http://randomerrata.com/post/45827813818/travis-s3
.travis.yml:

bundler_args: --without development --path=~/.bundle
language: ruby
rvm:
  - 1.9.3
env:
  global:
    - BUNDLE_ARCHIVE="your-bundle-name"
    - AWS_S3_REGION="us-east-1"
    - AWS_S3_BUCKET="your-bucket-name"
    - RAILS_ENV=test
    - secure: "A_VERY_LONG_SERIES_OF_CHARACTERS_HERE"
# Ensure Bundler >= 1.1, don't install rdocs, and fetch a cached bundle from S3
before_install:
  - "echo 'gem: --no-ri --no-rdoc' > ~/.gemrc"
  - gem install bundler fog
  - "./script/travis/bundle_install.sh"
before_script:
  - "cp config/database.example.yml config/database.yml"
after_script:
  - "ruby script/travis/bundle_cache.rb"
script: "bundle exec rake db:create db:test:load spec"

script/travis/bundle_cache.rb:

# encoding: UTF-8
require "digest"
require "fog"

bucket_name  = ENV["AWS_S3_BUCKET"]
architecture = `uname -m`.strip

file_name = "#{ENV['BUNDLE_ARCHIVE']}-#{architecture}.tgz"
file_path = File.expand_path("~/#{file_name}")
lock_file = File.join(File.expand_path(ENV["TRAVIS_BUILD_DIR"]), "Gemfile.lock")

digest_filename = "#{file_name}.sha2"
old_digest = File.expand_path("~/remote_#{digest_filename}")

puts "Checking for changes"
bundle_digest = Digest::SHA2.file(lock_file).hexdigest
old_digest = File.exist?(old_digest) ? File.read(old_digest) : ""

if bundle_digest == old_digest
  puts "=> There were no changes, doing nothing"
else
  if old_digest == ""
    puts "=> There was no existing digest, uploading a new version of the archive"
  else
    puts "=> There were changes, uploading a new version of the archive"
    puts "  => Old checksum: #{old_digest}"
    puts "  => New checksum: #{bundle_digest}"
  end

  # Archive ~/.bundle (note: -j compresses with bzip2 despite the .tgz name;
  # the plain `tar -xf` in the download script auto-detects the compression)
  # and split the archive into 5 MB chunks for S3's multipart upload.
  puts "=> Preparing bundle archive"
  `cd ~ && tar -cjf #{file_name} .bundle && split -b 5m -a 3 #{file_name} #{file_name}.`

  parts_pattern = File.expand_path(File.join("~", "#{file_name}.*"))
  parts = Dir.glob(parts_pattern).sort

  storage = Fog::Storage.new({
    :provider              => "AWS",
    :aws_access_key_id     => ENV["AWS_S3_KEY"],
    :aws_secret_access_key => ENV["AWS_S3_SECRET"],
    :region                => ENV["AWS_S3_REGION"] || "us-east-1"
  })

  puts "=> Uploading the bundle"
  puts "  => Beginning multipart upload"
  response = storage.initiate_multipart_upload bucket_name, file_name, { "x-amz-acl" => "public-read" }
  upload_id = response.body["UploadId"]
  puts "  => Upload ID: #{upload_id}"

  part_ids = []

  # Upload each chunk in order; S3 needs the ETag of every part to
  # assemble the final object.
  puts "  => Uploading #{parts.length} parts"
  parts.each_with_index do |part, index|
    part_number = (index + 1).to_s
    puts "    => Uploading #{part}"
    File.open part do |part_file|
      response = storage.upload_part bucket_name, file_name, upload_id, part_number, part_file
      part_ids << response.headers["ETag"]
      puts "      => Uploaded"
    end
  end

  puts "  => Completing multipart upload"
  storage.complete_multipart_upload bucket_name, file_name, upload_id, part_ids

  # Store the Gemfile.lock checksum alongside the archive so the next
  # build can tell whether the cached bundle is still current.
  puts "=> Uploading the digest file"
  bucket = storage.directories.new(key: bucket_name)
  bucket.files.create({
    :body         => bundle_digest,
    :key          => digest_filename,
    :public       => true,
    :content_type => "text/plain"
  })
end

puts "All done now."
exit 0
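
Travis runs bundle_cache.rb in after_script with the environment defined in .travis.yml. A hedged sketch of running it by hand from the project root, with placeholder credentials (assumes the fog gem is installed):

AWS_S3_BUCKET="your-bucket-name" AWS_S3_KEY="your-aws-key" \
AWS_S3_SECRET="your-aws-secret" BUNDLE_ARCHIVE="your-bundle-name" \
TRAVIS_BUILD_DIR="$PWD" ruby script/travis/bundle_cache.rb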

script/travis/bundle_install.sh:

#!/bin/sh

ARCHITECTURE=`uname -m`
FILE_NAME="$BUNDLE_ARCHIVE-$ARCHITECTURE.tgz"

cd ~

# Fetch the cached bundle and unpack it into ~/.bundle (`tar -xf`
# auto-detects the compression). Also fetch the remote digest so that
# bundle_cache.rb can compare checksums after the build.
wget -O "remote_$FILE_NAME" "https://$AWS_S3_BUCKET.s3.amazonaws.com/$FILE_NAME" && tar -xf "remote_$FILE_NAME"
wget -O "remote_$FILE_NAME.sha2" "https://$AWS_S3_BUCKET.s3.amazonaws.com/$FILE_NAME.sha2"

# Always exit 0 so the very first build (no archive on S3 yet) doesn't fail.
exit 0
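
Note that bundle_install.sh never compares the digest itself; that comparison happens in bundle_cache.rb after the build. A hedged sketch of performing the same staleness check from a shell, assuming GNU coreutils (Digest::SHA2 in the Ruby script defaults to SHA-256, which sha256sum matches):

CURRENT=$(sha256sum "$TRAVIS_BUILD_DIR/Gemfile.lock" | cut -d' ' -f1)
CACHED=$(cat ~/"remote_$FILE_NAME.sha2")
if [ "$CURRENT" = "$CACHED" ]; then
  echo "Cached bundle matches Gemfile.lock"
else
  echo "Gemfile.lock changed; a fresh archive will be uploaded"
fi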
@radixhound

There is so much awesome here. It was so easy to get this working. Thanks!

@ArturT

ArturT commented Jun 16, 2013

Thanks. I added an example using FTP instead of S3: https://gist.github.com/ArturT/5792488

@bradherman

I implemented this and my builds didn't get any faster for some reason... Here's a snapshot of my logs from the gem install fog command down to my gem installation... I cut out about 90% of the gem installs. It IS backing up to S3, though; I see the .bundle tar in S3 after the builds.

https://gist.github.com/bradherman/64481c14b90c3936cf40

@bradherman

If anything, my builds appear to be slower because the "gem install bundler fog" command hangs on "building native extensions" for around 2 minutes.
