@gleicon
Last active October 11, 2023 14:27
How to use boto3 with Google Cloud Storage and Python to emulate S3 access.
from boto3.session import Session
from botocore.client import Config
from botocore.handlers import set_list_objects_encoding_type_url
import boto3

# GCS interoperability (HMAC) credentials
ACCESS_KEY = "xx"
SECRET_KEY = "yy"

# Log all botocore activity to stderr for debugging
boto3.set_stream_logger('')

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY,
                  region_name="US-CENTRAL1")

# botocore adds encoding-type=url to ListObjects by default, which GCS
# does not understand, so unregister that handler.
session.events.unregister('before-parameter-build.s3.ListObjects',
                          set_list_objects_encoding_type_url)

s3 = session.resource('s3', endpoint_url='https://storage.googleapis.com',
                      config=Config(signature_version='s3v4'))

bucket = s3.Bucket('yourbucket')
for f in bucket.objects.all():
    print(f.key)
@stoyanK7

I managed to get it working. The comment under https://stackoverflow.com/a/21028609/9553927 helped:

The signed URLs worked for me. However, when I called generate_url() with the response_headers parameter and the value response-content-disposition, I got malformed signed URLs. So my solution was to concatenate '&response-content-disposition=attachment%3B%20filename%3D"{}"'.format(file_name) to the signed URL, and that worked.

params = {
    "Bucket": "xyz",
    "Key": blob_name,
}
ten_minutes = 600  # seconds
url = self.s3_client.generate_presigned_url(
    "get_object", Params=params, ExpiresIn=ten_minutes
).replace("AWSAccessKeyId", "GoogleAccessId")
url += '&response-content-disposition=attachment;filename="newFileName"'
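The manual concatenation above works but leaves the filename unencoded. A small helper (hypothetical, not part of the gist) can percent-encode the disposition value with the standard library before appending it, which keeps filenames containing spaces or special characters from breaking the URL:

```python
from urllib.parse import quote

def add_content_disposition(url, filename):
    # Append a response-content-disposition parameter so the browser
    # downloads the object under `filename` instead of displaying it.
    # Assumes `url` already has a query string (a presigned URL does).
    disposition = quote('attachment; filename="{}"'.format(filename), safe='')
    return url + '&response-content-disposition=' + disposition

print(add_content_disposition(
    "https://storage.googleapis.com/xyz/blob?Signature=abc", "report.pdf"))
```

This produces the same `%3B%20filename%3D` encoding shown in the Stack Overflow workaround, with the quotes encoded as well.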

@gleicon
Author

gleicon commented Oct 11, 2023

Awesome, thanks for sharing!
