@rizerzero, forked from philkuz/aws.py, created January 7, 2019
Check S3 bucket exists with Python

aws_script.py:
from aws import bucket_exists, upload_path

bucket_name = 'cnns-music-vids'
directory_to_upload = 'data/'
output_s3_directory = 'data/'

if bucket_exists(bucket_name):
    print('the bucket exists!')
else:
    raise ValueError('nah the bucket does not exist')

print('uploading directory {}'.format(directory_to_upload))
upload_path(directory_to_upload, bucket_name, output_s3_directory)
print('done uploading')
aws.py:
import os

import boto3
import botocore


def bucket_exists(bucket):
    s3 = boto3.resource('s3')
    return s3.Bucket(bucket) in s3.buckets.all()


def upload_path(local_directory, bucket, destination, certain_upload=False):
    client = boto3.client('s3')

    # enumerate local files recursively
    for root, dirs, files in os.walk(local_directory):
        for filename in files:
            # construct the full local path
            local_path = os.path.join(root, filename)

            # construct the full S3 key: path relative to the upload root,
            # joined under the destination prefix
            relative_path = os.path.relpath(local_path, local_directory)
            s3_path = os.path.join(destination, relative_path)

            if certain_upload:
                # upload unconditionally and move on to the next file
                client.upload_file(local_path, bucket, s3_path)
                continue

            print('Searching "%s" in "%s"' % (s3_path, bucket))
            try:
                client.head_object(Bucket=bucket, Key=s3_path)
                # print("Path found on S3! Skipping %s..." % s3_path)
            except botocore.exceptions.ClientError:
                print("Uploading %s..." % s3_path)
                client.upload_file(local_path, bucket, s3_path)

Setting up

Make sure you are using an environment with Python 3 available.

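For example, a throwaway virtualenv keeps the dependencies isolated (the environment name aws_upload_demo is just a suggestion, matching the prompt in the transcript above):

```shell
# Create and activate a Python 3 virtual environment
python3 -m venv aws_upload_demo
source aws_upload_demo/bin/activate
python --version   # should report Python 3.x
```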
Install prereqs

pip install awscli boto3

Configure AWS

Create or grab your AWS access key and secret key (in the IAM console under your security credentials), then run aws configure as below. Just press enter to accept the default region name.

$ aws configure 
AWS Access Key ID [****************AAAA]: 
AWS Secret Access Key [****************AAAA]: 
Default region name [us-west-2]: 

Run sample script

I've provided a sample script called aws_script.py. Edit it to point to a test path (data/), a real bucket name (cnns-music-vids), and an output directory (I'd keep it the same as your test path). If everything goes well, the run should look something like this:

# edit aws_script.py
$ python aws_script.py
the bucket exists!
uploading directory data/
Searching "data/vid1.mp4" in "cnns-music-vids"
Uploading data/vid1.mp4...
Searching "data/vid2.mp4" in "cnns-music-vids"
Uploading data/vid2.mp4...
done uploading
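The keys in that transcript come straight from upload_path's relpath/join logic. A minimal offline sketch of the mapping (the file names are invented, matching the sample run above):

```python
import os


def s3_key_for(local_path, local_directory, destination):
    # Reproduce upload_path's key construction: the path relative to the
    # upload root, joined under the destination prefix.
    relative_path = os.path.relpath(local_path, local_directory)
    return os.path.join(destination, relative_path)


print(s3_key_for('data/vid1.mp4', 'data/', 'data/'))      # data/vid1.mp4
print(s3_key_for('data/sub/vid3.mp4', 'data/', 'data/'))  # data/sub/vid3.mp4
```

One caveat: os.path.join uses the local OS separator, so on Windows backslashes would leak into S3 keys; building keys with posixpath.join avoids that.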

Then check the S3 console to confirm both files were uploaded.
