@jezhumble
Last active November 21, 2023 07:39
# Copyright 2019 Google LLC.
# SPDX-License-Identifier: Apache-2.0
# This snippet shows you how to use Blob.generate_signed_url() from within compute engine / cloud functions
# as described here: https://cloud.google.com/functions/docs/writing/http#uploading_files_via_cloud_storage
# (without needing access to a private key)
# Note: as described in that page, you need to run your function with a service account
# with the permission roles/iam.serviceAccountTokenCreator
import os, google.auth
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage
auth_request = requests.Request()
credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
data_bucket = storage_client.lookup_bucket(os.getenv("BUCKET_NAME"))
signed_blob_path = data_bucket.blob("FILENAME")
expires_at_ms = datetime.now() + timedelta(minutes=30)
# This next line is the trick!
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "", service_account_email=credentials.service_account_email)
signed_url = signed_blob_path.generate_signed_url(expires_at_ms, credentials=signing_credentials, version="v4")
@jezhumble

Also, just a note that you can't use the signed url from a browser. The right way to upload a file to GCS from a browser is to use a resumable upload as described here: https://cloud.google.com/storage/docs/xml-api/resumable-upload. You can get the URL to pass to a client running in a browser using this API call: https://google-cloud.readthedocs.io/en/latest/storage/blobs.html#google.cloud.storage.blob.Blob.create_resumable_upload_session
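A minimal sketch of that server-side step, assuming you already have a Blob from the client library (the helper name `resumable_session_url` is mine; `create_resumable_upload_session` is the real client-library method):

```python
def resumable_session_url(blob, content_type="application/octet-stream"):
    """Return a resumable-upload session URL for the given storage Blob.

    The URL can be handed to a browser client, which then PUTs the file
    bytes directly to it (no further credentials needed).
    """
    return blob.create_resumable_upload_session(content_type=content_type)
```

In a Cloud Function you would build the blob as in the gist above (`storage_client.bucket(...).blob(...)`) and return the session URL to the browser.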

@JayGoldberg

@jezhumble, can you explain why this wouldn't work in a browser? It's a normal POST/PUT operation to the returned URL with a Host: header. A request to the signed URL is the same as any XMLHttpRequest.

If it can be used in curl, it can be used in the browser. The HTTP verb you use is defined when the signed URL is generated with generate_signed_url() [1].

Many applications use signed URLs to grant the user (browser) temporary access to resources in buckets that they own using GET, and PUT/POST operations are only marginally different AFAIK.

[1] https://googleapis.github.io/google-cloud-python/latest/storage/blobs.html

@btoueg

btoueg commented May 9, 2019

You can remove the tricky line:

signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email,
)

by reusing the credentials from the storage client:

signed_url = signed_blob_path.generate_signed_url(expires_at_ms, credentials=client._credentials)

@RoudyBob

Hi @btoueg... trying to use your workaround here. Can you verify the simplified signed_url line you're using that allows you to remove the tricky line above? It seems to refer to a client object which is not defined above.

@jessjenk

Thanks for this tip, super-helpful!

For my Cloud Function, I use a service account that's not the default, so I had problems using the credentials returned by google.auth.default() and was getting an error from the generate_signed_url call:

TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 400,\n "message": "Invalid service account email (default).",\n "status": "INVALID_ARGUMENT"\n }\n}\n'

I fixed it by making sure I was using my service account's email to get the ID token:

from google.cloud import storage

# Initialize client using the project that the Cloud Function was deployed to,
# and the per-function service account
# https://cloud.google.com/functions/docs/securing/function-identity#per-function_identity
storage_client = storage.Client()
# ...
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    # You can also use
    #     service_account_email=os.environ["FUNCTION_IDENTITY"]
    # if relying on the _credentials attribute, which is not part of the
    # public interface, bothers you.
    service_account_email=storage_client._credentials.service_account_email,
)
# ...

@abirafdirp

abirafdirp commented Sep 10, 2019

Hi all, (thank you for the gist)

I'm using App Engine Gen 2 (Py 3.7).
I found, after some debugging, that even GAE goes through the GCE credentials provider in google.auth.default, so the issue of the default credentials being unable to sign also affects App Engine (Gen 2 only).
I needed to remove the service_account_email argument, since the email on the google.auth.default credentials is not a valid email (just "default"); without the argument, the library fetches the proper email from the GCE credentials provider.

My problem is as follows: as of now, setting aud to an empty string does not work; I get something along the lines of "empty scope is not allowed".

I don't know much about aud. I've tried putting the App Engine URL in it, and also https://www.googleapis.com/auth/devstorage.read_only; both audiences raise an invalid-credentials error when calling the storage googleapis endpoint.

I think my last resort is to mount the private key into the App Engine instance, which is a shame.

@raphet

raphet commented Nov 4, 2019

The script above did create signatures, but using them in actual PUT requests never worked, failing with this error:

SignatureDoesNotMatch

This is a misleading error message, because the signature is not the problem. What was actually missing was the correct "host" header, which for your Google bucket is:

host: storage.googleapis.com

Here is an example:

curl --request PUT \
  --url 'yourPresignedURL' \
  --header 'content-type: image/jpeg' \
  --header 'host: storage.googleapis.com' \
  --data 'yourBinaryContent'

I hope this saves someone out there some trouble.

@EricPHassey

Tried this, but unfortunately I'm still getting the error below. Any solutions, or has anyone else experienced this?

AttributeError: you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.

@jbn

jbn commented Jan 25, 2020

This is a misleading error message, because the signature is not the problem. What was actually missing was the correct "host" header, which for your Google bucket is:

For me, I needed to do,

signed_url = signed_blob_path.generate_signed_url(expires_at_ms, credentials=signing_credentials, version="v4", method='PUT')

@sparkinson

sparkinson commented May 14, 2020

I recently had problems getting this working. Eventually, I managed to create credentials that can sign blobs using:

import google.auth
from google.auth import compute_engine

auth_request = google.auth.transport.requests.Request()
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "") # this uses the default credentials by default, no need to pass service_account_email
signing_credentials.signer.sign(string_to_sign)

This uses the default credentials for the compute instance.

Note that you need to add permissions to the default credentials to be able to sign URLs, and to enable the IAM service in your project:

gcloud iam service-accounts add-iam-policy-binding --role=roles/iam.serviceAccountTokenCreator "$PROJ_NUMBER-compute@developer.gserviceaccount.com" --member="serviceAccount:$PROJ_NUMBER-compute@developer.gserviceaccount.com"
gcloud services enable iam.googleapis.com

The methods stated above were not working for me because the credentials obtained via google.auth.default() (or client._credentials) had not been refreshed, and so did not actually contain the service account email address. Attempts to refresh threw scope errors which I did not know how to solve.

@0zeroth

0zeroth commented Oct 11, 2020

If you encounter TypeError: 'Request' object is not callable, you are using requests.Request() instead of google.auth.transport.requests.Request().

Hoping this saves someone else from chasing the obvious for far too long...

@meetchandan

Hi,

I want to create a signed url using which I can upload a file to GCS.
Is just putting "POST" as the method parameter value enough?

@mezhaka

mezhaka commented Nov 17, 2020

I had a similar issue to what @jessjenk described above:

TransportError: Error calling the IAM signBytes API: b'{\n "error": {\n "code": 400,\n "message": "Invalid service account email (default).",\n "status": "INVALID_ARGUMENT"\n }\n}\n'

However, her workaround did not fix it in my case: the value of storage_client._credentials.service_account_email was "default", even though the node I was running on had a different service account attached. I had to specify the service account email explicitly to make it work:

        signing_credentials = compute_engine.IDTokenCredentials(
            auth_request,
            "",
            service_account_email="crazy@home.iam.gserviceaccount.com",
        )

@mezhaka

mezhaka commented Nov 19, 2020

Does anyone know if it's OK to create an IDTokenCredentials instance once and use it for the lifetime of the application? I have browsed the implementation a bit; it has a refresh method, which made me think that either something will call it when the token expires, or I am supposed to call it myself.
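One hedged pattern for this (the helper is mine, not part of google-auth): keep the single instance, but guard each use with the .valid/.refresh(request) pair that google-auth credentials objects expose:

```python
def ensure_fresh(creds, auth_request):
    """Refresh `creds` in place if its token is missing or expired.

    `creds` is any google-auth credentials object (e.g. IDTokenCredentials);
    `auth_request` is a google.auth.transport.requests.Request.
    """
    if not creds.valid:
        creds.refresh(auth_request)
    return creds
```

Calling this before each generate_signed_url should let you reuse one credentials object safely, assuming .valid reflects token expiry the way it does for other google-auth credential types.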

@econtal

econtal commented Jan 7, 2021

@mezhaka

However, her workaround did not fix it in my case: the value of storage_client._credentials.service_account_email was "default", even though the node I was running on had a different service account attached.

Actually, you need to make an API call with your client first for service_account_email to be set to the default email automatically. So, for instance, this should work:

storage_client = storage.Client()
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=storage_client._credentials.service_account_email)

or if you don't like using a private attribute, you can be more explicit:

credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email)

@deven96

deven96 commented Apr 22, 2021

""" Generating a downloadable GET link  (that expires in 30 minutes) for a  file in bucket '"""
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage

auth_request = requests.Request()
storage_client = storage.Client()
data_bucket = storage_client.bucket("BUCKET_NAME")
blob = data_bucket.get_blob("FILENAME")
expires_at_ms = datetime.now() + timedelta(minutes=30)
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signed_url = blob.generate_signed_url(expires_at_ms, credentials=signing_credentials)

I am using App Engine Python 3 and this worked for me. However, I had to add the IAM Service Account Token Creator role to my App Engine default service account (project-name@appspot.gserviceaccount.com), otherwise it failed with INFO:root:Error calling the IAM signBytes API: b'{\n "error": {\n "code": 403,\n "message": "The caller does not have permission",\n "status": "PERMISSION_DENIED"\n }\n}\n'. The role apparently enables the account to sign blobs.

@nguaman

nguaman commented Oct 22, 2021

Service Account Token Creator

This works for me in Google Cloud Run (Oct 2021).

@deven96 Thanks!

@igortxra

Thanks @deven96, this was very useful. Works for me too (Oct 2021).

@deven96

deven96 commented Oct 28, 2021

My pleasure! @nguaman @igortxra

@patrickchho

patrickchho commented Dec 24, 2021

@deven96 this works - thank you! (dec-2021).

One more note for future readers: the solution works in the cloud environment. Locally, I get google.auth.exceptions.TransportError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. To work around this locally, I use the service account JSON key.

@adriangb

Someone please correct me if I'm wrong, but this ends up making an HTTP request each time it needs to sign, right? And a synchronous one at that.

@mezhaka

mezhaka commented Apr 25, 2022

@adriangb This is my understanding as well. But there's a way to sign completely offline AFAIR.

@adriangb

Not without a private key, which is super problematic both for production and local development.

@saiayn

saiayn commented Jul 9, 2023

If you need to accomplish this task within a Firebase Cloud function using .onCall(), please follow the steps outlined below:

from firebase_admin import storage
from google.auth import compute_engine
from google.auth.transport.requests import Request
from datetime import timedelta

def your_function(parameter):
    # Create an authentication request
    auth_request = Request()

    # Get your IDTokenCredentials
    signing_credentials = compute_engine.IDTokenCredentials(
        auth_request,
        "",
        service_account_email='<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>'
    )

    # Get your storage bucket
    data_bucket = storage.bucket('<YOUR BUCKET>')
    
    # Generate a signed URL for your bucket
    blob = data_bucket.blob(parameter)
    url = blob.generate_signed_url(
        expiration=timedelta(days=7),
        credentials=signing_credentials, 
        version="v4"
    )
    
    return url

Remember to replace '<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>' and '<YOUR BUCKET>' with your actual service account email and bucket name, respectively.
