
fog - resume uploading example

Set up by placing resume.rb and Gemfile in a directory together, then run bundle install.

You can configure this script with environment variables. In every case, specify DIRECTORY_NAME, FILE, and PROVIDER, then add the appropriate credentials for the selected provider:

Amazon S3: DIRECTORY_NAME=fogresumes FILE=resume.html PROVIDER=AWS AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=YYY bundle exec ruby resume.rb

Google Storage For Developers: DIRECTORY_NAME=fogresumes FILE=resume.html PROVIDER=Google GOOGLE_STORAGE_ACCESS_KEY_ID=XXX GOOGLE_STORAGE_SECRET_ACCESS_KEY=YYY bundle exec ruby resume.rb

Local: DIRECTORY_NAME=fogresumes FILE=resume.html PROVIDER=Local LOCAL_ROOT=XXX bundle exec ruby resume.rb

Rackspace Cloud Files: DIRECTORY_NAME=fogresumes FILE=resume.html PROVIDER=Rackspace RACKSPACE_API_KEY=XXX RACKSPACE_USERNAME=YYY bundle exec ruby resume.rb

You can also add MOCK=true if you'd like to run in mocked mode (except for Rackspace, which doesn't have mocks yet; contributions are welcome). Just note that in mock mode the URL it lists won't actually work.
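The script picks its credentials out of the environment by whitelisting the recognized variable names and mapping each one to a lowercased symbol (so AWS_ACCESS_KEY_ID becomes :aws_access_key_id, which is the form fog expects). A minimal sketch of that mapping, using a plain hash in place of ENV:

# Whitelist of recognized configuration variables; anything else in the
# environment (PATH, HOME, ...) is ignored.
VALID_KEYS = %w[
  AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY
  GOOGLE_STORAGE_ACCESS_KEY_ID GOOGLE_STORAGE_SECRET_ACCESS_KEY
  LOCAL_ROOT PROVIDER RACKSPACE_API_KEY RACKSPACE_USERNAME
].freeze

# Build the credentials hash fog expects: lowercase symbol keys.
def credentials_from(env)
  env.select { |key, _| VALID_KEYS.include?(key) }
     .map { |key, value| [key.downcase.to_sym, value] }
     .to_h
end

sample = {
  'PROVIDER'          => 'AWS',
  'AWS_ACCESS_KEY_ID' => 'XXX',
  'PATH'              => '/usr/bin' # ignored: not a recognized key
}
puts credentials_from(sample).inspect

This is the same whitelist-then-symbolize convention resume.rb uses before handing the hash to Fog::Storage.new.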

Gemfile:

source 'https://rubygems.org'

gem 'fog'

resume.rb:

#!/usr/bin/env ruby

require 'rubygems'
require 'fog'

if ENV['MOCK'] == 'true'
  Fog.mock!
end

# pull valid configuration values from ENV and map them to lowercase symbols
credentials = {}
valid_keys = [
  'AWS_ACCESS_KEY_ID', 'AWS_SECRET_ACCESS_KEY',
  'GOOGLE_STORAGE_ACCESS_KEY_ID', 'GOOGLE_STORAGE_SECRET_ACCESS_KEY',
  'LOCAL_ROOT', 'PROVIDER', 'RACKSPACE_API_KEY', 'RACKSPACE_USERNAME'
]
for key, value in ENV
  if valid_keys.include?(key)
    credentials[key.downcase.to_sym] = value
  end
end

# ensure provider was set
unless credentials[:provider]
  raise('Please specify PROVIDER to use, choose one of [AWS, Google, Local, Rackspace]')
end

# pull directory name from ENV
unless directory_name = ENV['DIRECTORY_NAME']
  raise('Please specify DIRECTORY_NAME to upload to')
end

# pull file from ENV to set file data and name
unless file = ENV['FILE']
  raise('Please specify FILE to upload')
end
file_path = File.expand_path(file)
file_name = file_path.split(File::SEPARATOR).last
file_data = File.open(file_path)

Formatador.display_line # header padding

# create connection from credentials
Formatador.display_line("Creating connection [bold]#{credentials.inspect}[/]")
connection = Fog::Storage.new(credentials)

# find directory or create it if it doesn't exist
begin
  directory = connection.directories.get(directory_name)
rescue Excon::Errors::Forbidden
  raise('DIRECTORY_NAME is already in use by another user, please change DIRECTORY_NAME and try again')
end

if directory
  Formatador.display_line("Found directory [bold]#{directory_name}[/]")
else
  directory = connection.directories.create(
    :key    => directory_name,
    :public => true
  )
  Formatador.display_line("Created directory [bold]#{directory_name}[/]")
end

# create the file in the directory
Formatador.display_line("Creating file [bold]#{file_name}[/]")
file = directory.files.create(
  :key    => file_name,
  :body   => file_data,
  :public => true
)

# report where the uploaded file can be found
unless ENV['PROVIDER'] == 'Local'
  Formatador.display_line("File is available at [negative]#{file.public_url}[/]")
else
  path = File.join(credentials[:local_root], directory_name, file_name)
  Formatador.display_line("File is available at [negative]#{path}[/]")
end

# cleanup: remove the file and directory that were created
Formatador.display("Press return to begin cleanup...")
STDIN.gets
Formatador.display_line("Deleting file [bold]#{file_name}[/]")
file.destroy
Formatador.display_line("Deleting directory [bold]#{directory_name}[/]")
directory.destroy

Formatador.display_line # footer padding

Have you implemented the s3.get_link which provides a temporary URL to a private file?
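For what it's worth, fog's storage file model exposes url(expires), which on providers that support signed URLs (such as AWS) returns a time-limited link to a private object. A sketch under mocked AWS credentials (placeholder values, not real keys):

require 'rubygems'
require 'fog'

# Run against fog's in-memory mocks so no real credentials are needed.
Fog.mock!

connection = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => 'XXX', # placeholder credentials
  :aws_secret_access_key => 'YYY'
)

directory = connection.directories.create(:key => 'fogresumes')
file = directory.files.create(:key => 'resume.html', :body => '<html></html>')

# url(expires) signs a time-limited URL for the (private) file.
puts file.url(Time.now + 600)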
