@slotrans
Created March 27, 2013 21:26
Quick Python script for multipart S3 uploads, in case you need to upload a file larger than the 5 GB single-PUT limit. Split your file with something like /usr/bin/split, then invoke this as s3_multipart_upload.py targetbucket targetfilename part1 part2 part3 (or a glob like part*). AWS credentials are taken from the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
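A minimal sketch of the split step described above, using a dummy file so the sizes are easy to check. The file and chunk names (bigfile.bin, part_) are placeholders, and the upload command at the end is shown commented out since it needs real AWS credentials and a real bucket:

```shell
# Create a dummy 10 MB file standing in for the large file to upload.
dd if=/dev/zero of=bigfile.bin bs=1M count=10 2>/dev/null

# Split it into 4 MB chunks named part_aa, part_ab, part_ac.
# (Note: S3 multipart uploads require every part except the last
# to be at least 5 MB; 4 MB is used here only to keep the demo small.)
split -b 4M bigfile.bin part_

ls -l part_*

# Then upload the chunks (bucket and key names are placeholders):
# python s3_multipart_upload.py targetbucket targetfilename part_*
```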
import sys

import boto  # boto 2.x

bucketname = sys.argv[1]
filename = sys.argv[2]
parts = sys.argv[3:]

print('target=s3://{0}/{1}'.format(bucketname, filename))

# Credentials are read from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY.
conn = boto.connect_s3()
bucket = conn.lookup(bucketname)
mp = bucket.initiate_multipart_upload(filename)

# Upload each local chunk as a numbered part (part numbers start at 1).
partnum = 0
for localfile in parts:
    with open(localfile, 'rb') as fp:
        partnum += 1
        print('uploading part #{0} from {1}...'.format(partnum, localfile))
        mp.upload_part_from_file(fp, partnum)

# List the parts S3 has received before completing the upload.
print('\nsummary...')
print('part number: bytes')
for part in mp:
    print('{0}: {1}'.format(part.part_number, part.size))

mp.complete_upload()