# Copyright 2019 Google LLC.
# SPDX-License-Identifier: Apache-2.0
# This snippet shows you how to use Blob.generate_signed_url() from within compute engine / cloud functions
# as described here: https://cloud.google.com/functions/docs/writing/http#uploading_files_via_cloud_storage
# (without needing access to a private key)
# Note: as described in that page, you need to run your function with a service account
# with the permission roles/iam.serviceAccountTokenCreator
import os, google.auth
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage
auth_request = requests.Request()
credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
data_bucket = storage_client.lookup_bucket(os.getenv("BUCKET_NAME"))
signed_blob_path = data_bucket.blob("FILENAME")
expires_at = datetime.utcnow() + timedelta(minutes=30)  # naive datetimes are treated as UTC when signing
# This next line is the trick!
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "", service_account_email=credentials.service_account_email)
signed_url = signed_blob_path.generate_signed_url(expires_at, credentials=signing_credentials, version="v4")
@jezhumble

Owner Author

commented Apr 21, 2019

Also, just a note that you can't use the signed url from a browser. The right way to upload a file to GCS from a browser is to use a resumable upload as described here: https://cloud.google.com/storage/docs/xml-api/resumable-upload. You can get the URL to pass to a client running in a browser using this API call: https://google-cloud.readthedocs.io/en/latest/storage/blobs.html#google.cloud.storage.blob.Blob.create_resumable_upload_session
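A minimal sketch of that server-side call, assuming placeholder bucket/blob names supplied by the caller (the google-cloud-storage import is deferred so it is only needed at call time):

```python
def make_browser_upload_url(bucket_name, blob_name, origin):
    """Create a resumable upload session and return its URL for a browser client.

    bucket_name, blob_name, and origin are placeholders supplied by the caller.
    """
    # Deferred import: requires google-cloud-storage and ambient credentials.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # `origin` sets the Origin header on the session so the browser at that
    # origin can PUT the file (in chunks) directly to the returned URL.
    return blob.create_resumable_upload_session(
        content_type="application/octet-stream",
        origin=origin,
    )
```

The returned session URL is then handed to the browser, which uploads to it directly without any further server involvement.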

@JayGoldberg

commented May 7, 2019

@jezhumble, can you explain why this wouldn't work in a browser? It's a normal POST/PUT operation to the returned URL with a Host: header. A PUT to the signed URL is the same as any other XMLHttpRequest.

If it can be used in curl, it can be used in the browser. The HTTP verb you use is defined when the signed URL is generated with generate_signed_url() [1].

Many applications use signed URLs to grant the user (browser) temporary GET access to resources in buckets they own, and PUT/POST operations are only marginally different AFAIK.

[1] https://googleapis.github.io/google-cloud-python/latest/storage/blobs.html
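For illustration, a sketch of signing a PUT URL as described above; the verb and content type are pinned at signing time, and the expiry and content type used here are assumptions, not from the gist:

```python
from datetime import timedelta

def make_put_url(blob, minutes=15, content_type="text/plain"):
    # `blob` is a google.cloud.storage Blob. The verb and content type are
    # fixed when the URL is signed; the client's request must match them.
    return blob.generate_signed_url(
        expiration=timedelta(minutes=minutes),
        method="PUT",
        content_type=content_type,
        version="v4",
    )
```

The client then uploads with the same verb and headers, e.g. `curl -X PUT -H "Content-Type: text/plain" --upload-file hello.txt "$SIGNED_URL"`.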

@btoueg

commented May 9, 2019

You can remove the tricky line:

signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email,
)

by reusing the credentials from the storage client:

signed_url = signed_blob_path.generate_signed_url(expires_at_ms, credentials=client._credentials)
@RoudyBob

commented Jul 25, 2019

Hi @btoueg... trying to use your workaround here. Can you verify the simplified generate_signed_url line you're using that allows you to remove the tricky line above? It seems to refer to a client object which is not defined above.

@jessjenk

commented Aug 26, 2019

Thanks for this tip, super-helpful!

For my Cloud Function, I use a service account that's not the default, so I had problems using the credentials returned by google.auth.default() and was getting an error from the generate_signed_url call:

TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 400,\n "message": "Invalid service account email (default).",\n "status": "INVALID_ARGUMENT"\n }\n}\n'

I fixed it by making sure I was using my service account's email to get the ID token:

from google.cloud import storage

# Initialize client using the project that the Cloud Function was deployed to,
# and the per-function service account
# https://cloud.google.com/functions/docs/securing/function-identity#per-function_identity
storage_client = storage.Client()
# ...
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    # You can also do
    #     service_account_email=os.environ["FUNCTION_IDENTITY"]
    # if relying on _credentials, which is not part of the public interface, bothers you.
    service_account_email=storage_client._credentials.service_account_email,
)
# ...
@abirafdirp

commented Sep 10, 2019

Hi all, (thank you for the gist)

I'm using App Engine Gen 2 (Py 3.7).
After some debugging, I found that even GAE goes through the GCE credentials provider in google.auth.default().
So the problem of the default credentials not being able to sign also affects App Engine (Gen 2 only).
I had to remove the service_account_email arg, since the email in the google.auth.default() credentials is not a valid email (just "default"); without the arg, IDTokenCredentials fetches the proper email from the GCE credentials provider.

My problem is as follows.
As of now, setting aud to an empty string does not work; I get an error along the lines of "empty scope is not allowed".

I don't know much about aud. I've tried putting the App Engine URL in it and also tried https://www.googleapis.com/auth/devstorage.read_only; both audiences raise an invalid-credentials error when calling the storage googleapis endpoint.

I think my last resort is just to mount a private key into the App Engine instance, which is a shame.
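For what it's worth, one possible alternative to mounting a key is to delegate signing to the IAM signBytes API. This is a hedged sketch, not a confirmed fix for the GAE case: the token_uri value and the use of google.auth.iam.Signer here are my assumptions, and the runtime service account still needs roles/iam.serviceAccountTokenCreator:

```python
def iam_signing_credentials(service_account_email):
    """Build credentials that sign via the IAM signBytes API.

    Sketch only: assumes google-auth is installed and the runtime service
    account holds roles/iam.serviceAccountTokenCreator on itself.
    """
    # Deferred imports: the google-auth package is only needed at call time.
    import google.auth
    from google.auth import iam
    from google.auth.transport import requests as auth_requests
    from google.oauth2 import service_account

    source_credentials, _ = google.auth.default()
    signer = iam.Signer(auth_requests.Request(), source_credentials,
                        service_account_email)
    # service_account.Credentials built from this signer can sign blobs
    # remotely, which is what generate_signed_url needs.
    return service_account.Credentials(
        signer,
        service_account_email,
        token_uri="https://oauth2.googleapis.com/token",  # assumed endpoint
    )
```

The result would be passed as `credentials=` to generate_signed_url in place of the IDTokenCredentials from the gist.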
