@jezhumble
Last active November 21, 2023 07:39
# Copyright 2019 Google LLC.
# SPDX-License-Identifier: Apache-2.0
# This snippet shows how to use Blob.generate_signed_url() from within Compute Engine / Cloud Functions,
# as described here: https://cloud.google.com/functions/docs/writing/http#uploading_files_via_cloud_storage
# (without needing access to a private key).
# Note: as described on that page, you need to run your function with a service account
# that has the role roles/iam.serviceAccountTokenCreator.
import os
from datetime import datetime, timedelta

import google.auth
from google.auth import compute_engine
from google.auth.transport import requests
from google.cloud import storage

auth_request = requests.Request()
credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
data_bucket = storage_client.lookup_bucket(os.getenv("BUCKET_NAME"))
signed_blob_path = data_bucket.blob("FILENAME")
expires_at = datetime.now() + timedelta(minutes=30)  # URL valid for 30 minutes
# This next line is the trick: sign via the IAM signBlob API instead of a local private key.
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request, "", service_account_email=credentials.service_account_email
)
signed_url = signed_blob_path.generate_signed_url(
    expires_at, credentials=signing_credentials, version="v4"
)
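As a quick sanity check, a v4 URL produced as above should carry the query parameters defined by the GCS v4 signing process (X-Goog-Algorithm, X-Goog-Signature, and so on). A minimal structural check using only the standard library -- the example URL in the usage note below is made up for illustration -- might look like:

```python
from urllib.parse import parse_qs, urlsplit

# Query parameters defined by the GCS v4 signing process.
V4_PARAMS = {
    "X-Goog-Algorithm", "X-Goog-Credential", "X-Goog-Date",
    "X-Goog-Expires", "X-Goog-SignedHeaders", "X-Goog-Signature",
}

def looks_like_v4_signed_url(url):
    """Loose structural check: does the URL carry all v4 signing params?"""
    query = parse_qs(urlsplit(url).query)
    return V4_PARAMS.issubset(query)
```

For example, `looks_like_v4_signed_url("https://storage.googleapis.com/bucket/file")` is False because an unsigned object URL carries none of those parameters.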
@mezhaka

mezhaka commented Nov 19, 2020

Does anyone know if it's OK to create an IDTokenCredentials instance once and reuse it for the lifetime of the application? I have browsed the implementation a bit -- it has a refresh method, which made me think that either something will call that method when the token expires, or maybe I am supposed to call it myself?
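For what it's worth, google-auth credentials objects expose valid / expired properties and a refresh(request) method, and the usual pattern is to refresh only when the cached token is no longer valid, so one long-lived instance can be reused. A hedged sketch of that pattern, using a stand-in class instead of real credentials so it runs without GCP access:

```python
class FakeCredentials:
    """Stand-in for a google-auth credentials object; it mimics the
    valid / refresh(request) interface (an assumption for illustration)."""
    def __init__(self):
        self.valid = False
        self.refresh_count = 0

    def refresh(self, request):
        # A real implementation would fetch a fresh token here.
        self.refresh_count += 1
        self.valid = True

def ensure_fresh(credentials, request):
    # Refresh only when the cached token is missing or expired,
    # so a single instance can be reused across many signing calls.
    if not credentials.valid:
        credentials.refresh(request)
    return credentials
```

Calling ensure_fresh twice on the same instance triggers only one refresh; the second call is a no-op while the token is still valid.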

@econtal

econtal commented Jan 7, 2021

@mezhaka

However, that workaround did not fix it in my case -- the value of storage_client._credentials.service_account_email was "default", even though the node I was running it on had a different service account.

Actually, you need to make some API call with your client first, which automatically sets service_account_email to the default email. So, for instance, this should work:

storage_client = storage.Client()
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=storage_client._credentials.service_account_email)

or if you don't like using a private attribute, you can be more explicit:

credentials, project = google.auth.default()
storage_client = storage.Client(project, credentials)
my_bucket = storage_client.get_bucket('my_bucket')
signing_credentials = compute_engine.IDTokenCredentials(
    auth_request,
    "",
    service_account_email=credentials.service_account_email)

@deven96

deven96 commented Apr 22, 2021

""" Generating a downloadable GET link  (that expires in 30 minutes) for a  file in bucket '"""
from google.auth.transport import requests
from google.auth import compute_engine
from datetime import datetime, timedelta
from google.cloud import storage

auth_request = requests.Request()
storage_client = storage.Client()
data_bucket = storage_client.bucket("BUCKET_NAME")
blob = data_bucket.get_blob("FILENAME")
expires_at = datetime.now() + timedelta(minutes=30)
signing_credentials = compute_engine.IDTokenCredentials(auth_request, "")
signed_url = blob.generate_signed_url(expires_at, credentials=signing_credentials)

I am using App Engine (Python 3) and this worked for me. However, I had to add the IAM Service Account Token Creator role to my App Engine default service account, project-name@appspot.gserviceaccount.com; otherwise it showed the error INFO:root:Error calling the IAM signBytes API: b'{\n "error": {\n "code": 403,\n "message": "The caller does not have permission",\n "status": "PERMISSION_DENIED"\n }\n}\n'. The role apparently enables the account to sign blobs.

@nguaman

nguaman commented Oct 22, 2021

Service Account Token Creator

This works for me on Google Cloud Run (Oct 2021).

@deven96 Thanks!

@igortxra

Thanks @deven96, this was very useful. It works for me too (Oct 2021).

@deven96

deven96 commented Oct 28, 2021

My pleasure! @nguaman @igortxra

@patrickchho

patrickchho commented Dec 24, 2021

@deven96 this works -- thank you! (Dec 2021)

One more note for future readers: the solution works in the cloud environment. Locally, I get google.auth.exceptions.TransportError: Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true from the Google Compute Engine metadata service. To work around this locally, I use a service account JSON key.
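One way to handle that cloud-vs-local split is to branch on whether a key file is configured: locally, a JSON key (pointed to by GOOGLE_APPLICATION_CREDENTIALS) carries a private key that lets generate_signed_url() sign without any network call, while in the cloud you fall back to the IDTokenCredentials route from the gist. A hedged sketch of that dispatch -- the mode names here are illustrative only:

```python
import os

def pick_signing_mode(env=None):
    """Decide how to produce signed URLs, based on the environment.

    Returns "private-key" or "iam-signblob" (made-up labels for this sketch).
    """
    env = os.environ if env is None else env
    if env.get("GOOGLE_APPLICATION_CREDENTIALS"):
        # Local dev: a JSON key file with an embedded private key;
        # generate_signed_url() can sign offline with it.
        return "private-key"
    # Cloud env: no key on disk; sign via the IAM signBlob API
    # using compute_engine.IDTokenCredentials, as in the gist.
    return "iam-signblob"
```

The caller would then construct either a Client from the key file or the IDTokenCredentials object accordingly.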

@adriangb

Someone please correct me if I'm wrong, but this ends up making an HTTP request each time it needs to sign, right? And a synchronous one at that.
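If that per-sign HTTP call matters, one mitigation is to cache signed URLs per blob name until shortly before they expire, so repeated requests for the same object reuse one signature. A hedged sketch, parameterized by a sign_fn callback (assumed to wrap generate_signed_url() from the gist) so it runs without GCP access:

```python
import time

def make_cached_signer(sign_fn, ttl_seconds=1800, refresh_margin=60):
    """Return a function that caches signed URLs per blob name.

    sign_fn(blob_name, ttl_seconds) -> url is an assumed callback that
    performs the real (network-bound) signing.
    """
    cache = {}  # blob_name -> (url, expires_at_epoch_seconds)

    def get_url(blob_name):
        now = time.time()
        hit = cache.get(blob_name)
        if hit and hit[1] - refresh_margin > now:
            return hit[0]  # still comfortably valid; skip the signing call
        url = sign_fn(blob_name, ttl_seconds)
        cache[blob_name] = (url, now + ttl_seconds)
        return url

    return get_url
```

The refresh_margin keeps you from handing out URLs that are about to expire mid-download; tune it to your largest expected transfer time.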

@mezhaka

mezhaka commented Apr 25, 2022

@adriangb This is my understanding as well. But there's a way to sign completely offline AFAIR.

@adriangb

Not without a private key, which is super problematic both for production and local development.

@saiayn

saiayn commented Jul 9, 2023

If you need to accomplish this task within a Firebase Cloud function using .onCall(), please follow the steps outlined below:

from firebase_admin import storage
from google.auth import compute_engine
from google.auth.transport.requests import Request
from datetime import timedelta

def your_function(parameter):
    # Create an authentication request
    auth_request = Request()

    # Get your IDTokenCredentials
    signing_credentials = compute_engine.IDTokenCredentials(
        auth_request,
        "",
        service_account_email='<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>'
    )

    # Get your storage bucket
    data_bucket = storage.bucket('<YOUR BUCKET>')
    
    # Generate a signed URL for your bucket
    blob = data_bucket.blob(parameter)
    url = blob.generate_signed_url(
        expiration=timedelta(days=7),
        credentials=signing_credentials, 
        version="v4"
    )
    
    return url

Remember to replace '<ADD YOUR SERVICE ACCOUNT MAIL(Principal)>' and '<YOUR BUCKET>' with your actual service account email and bucket name, respectively.
