@yosukehara
Created December 17, 2012 12:34
I tried the multipart upload API with Ruby. The sample code is as follows:
require 'aws-sdk'

Endpoint = 'localhost'
Port     = 8080

# HTTP handler that rewrites the request port so every request goes to
# the LeoFS gateway listening on Port instead of the default S3 port.
class LeoFSHandler < AWS::Core::Http::NetHttpHandler
  def handle(request, response)
    request.port = ::Port
    super
  end
end

AWS.config(
  access_key_id:       '05236',
  secret_access_key:   '802562235',
  s3_endpoint:         Endpoint,
  http_handler:        LeoFSHandler.new,
  s3_force_path_style: true,
  use_ssl:             false
)

file_path_for_multipart_upload = 'test-large-file.tar.gz'
bucket = AWS::S3.new.buckets['photo']

File.open(file_path_for_multipart_upload, 'rb') do |file|
  uploading_object = bucket.objects[File.basename(file.path)]
  uploading_object.multipart_upload do |upload|
    until file.eof?
      # upload.add_part(file.read(5_242_880))  ## 5242880  : 5MB
      upload.add_part(file.read(15_728_640))   ## 15728640 : 15MB
      p('Aborted') if upload.aborted?
    end
  end
end
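
As a quick sanity check after the upload completes, the same aws-sdk v1 object handle can report whether the object exists and how large it is. A minimal sketch, assuming the script above has already run and that `file_path_for_multipart_upload` still points at the local file:

    # Hypothetical follow-up check (not part of the original sample):
    # compare the size stored on the gateway with the local file.
    bucket = AWS::S3.new.buckets['photo']
    key    = File.basename(file_path_for_multipart_upload)
    object = bucket.objects[key]

    if object.exists?
      remote_size = object.content_length
      local_size  = File.size(file_path_for_multipart_upload)
      puts "#{key}: remote=#{remote_size} bytes, local=#{local_size} bytes"
      puts(remote_size == local_size ? 'sizes match' : 'size mismatch')
    else
      puts "#{key} was not found on the gateway"
    end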
@yosukehara (Author) commented:

  • gateway/etc/app.config (a chunk-count check against these settings appears at the end of this comment)

    {leo_gateway, [
                   %% == Gateway Properties ==
                   {listener, leo_s3_http},
                   {layer_of_dirs, {1, 12} },
    
                   {s3_http, [
                              %% Use S3-API ?
                              {s3_api, true},
    
                              %% HTTP-Server: [cowboy]
                              {http_server, cowboy},
    
                              %% Gateway's port number
                              {port, 8080 },
    
                              %% # of acceptors
                              {num_of_acceptors, 128 },
    
                              %% == ssl related ==
                              {ssl_port,     8443 },
                              {ssl_certfile, "./etc/server_cert.pem" },
                              {ssl_keyfile,  "./etc/server_key.pem" },
    
                              %% == large-object related ==
                               {max_chunked_objs,      1000      }, %% maximum number of chunked objects
                               {max_len_for_obj,       524288000 }, %% 500MB: maximum object length
                               {chunked_obj_len,       5242880   }, %% 5MB:   length of a chunked object
                               {threshold_obj_len,     5767168   }, %% 5.5MB: threshold of object length
                              .
                              .
                              .
    
  • s3cmd result:

    $ s3cmd ls s3://photo
    2012-12-17 12:27 156286259   s3://photo/test-large-file.tar.gz
    
  • curl command result (an SDK-based read-back sketch follows below):

    $ curl -X GET http://photo.localhost:8080/test-large-file.tar.gz > new-test-large-file.tar.gz
    % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                   Dload  Upload   Total   Spent    Left  Speed
    100  149M    0  149M    0     0  45.0M      0 --:--:--  0:00:03 --:--:-- 45.0M
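
The same read-back can be done through the v1 SDK instead of curl. A minimal sketch, assuming the AWS.config settings from the Ruby sample above are in place; the block form of `S3Object#read` streams the body in chunks:

    # SDK-based download, roughly equivalent to the curl command above.
    object = AWS::S3.new.buckets['photo'].objects['test-large-file.tar.gz']

    File.open('new-test-large-file.tar.gz', 'wb') do |out|
      # The block form yields chunks as they arrive from the gateway.
      object.read { |chunk| out.write(chunk) }
    end

    puts File.size('new-test-large-file.tar.gz')  # should match the size listed by s3cmd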
    

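The large-object settings in app.config above determine how the gateway stores the uploaded file: with `chunked_obj_len` at 5242880 bytes (5 MB), the 156,286,259-byte object listed by s3cmd works out to about 30 chunks, well inside `max_chunked_objs` (1000) and `max_len_for_obj` (524288000 bytes, i.e. 500 MB). A back-of-the-envelope check in plain Ruby, using only the numbers already shown above (the gateway's exact chunking may differ in detail):

    # Rough chunk-count check against the gateway's large-object settings.
    chunked_obj_len  = 5_242_880     # 5MB per chunk       (from app.config)
    max_chunked_objs = 1_000         # chunk-count limit   (from app.config)
    max_len_for_obj  = 524_288_000   # 500MB object limit  (from app.config)

    object_size = 156_286_259        # size reported by `s3cmd ls`

    chunks = (object_size.to_f / chunked_obj_len).ceil
    puts "#{chunks} chunks of up to #{chunked_obj_len} bytes"          # => 30 chunks
    puts "within max_chunked_objs? #{chunks <= max_chunked_objs}"      # => true
    puts "within max_len_for_obj?  #{object_size <= max_len_for_obj}"  # => true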