@rkoshy
Created January 14, 2024 13:27
Setting up Airflow with GCP (Google Cloud) Credentials stored in AWS Secrets Manager

Storing Google Cloud Platform (GCP) Credentials in AWS Secrets Manager for Apache Airflow to connect to BigQuery

This took me a while to figure out from the code and docs. Most search results pointed to unhelpful articles (usually about storing the credentials in GCP Secret Manager instead).

  1. Create a service account (or other account)
  2. Create a new key and download the JSON file
  3. Open the file & copy the entire JSON into the clipboard
  4. Create an AWS secret with an appropriate name, let's say airflow/connection/mydag/gcp)
    • Set the key to extra
    • Set the value to:
    {
     "extra__google_cloud_platform__keyfile_dict": "{ <contents of your key-file> }"
    }
  5. Make sure Airflow's secrets backend is set to AWS Secrets Manager
    • You should use a connections prefix (we're using airflow/connection for connections)
  6. Now you can use mydag/gcp as the connection ID in your DAG to connect to BigQuery
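The nesting in step 4 is the easy part to get wrong: the key file must end up as a JSON-encoded *string* inside the extras, not as a nested object. A minimal sketch of building the secret body (the key-file fields here are hypothetical placeholders; your real values come from the JSON downloaded in step 2):

```python
import json

# Hypothetical service-account key; in practice this is the JSON file
# downloaded from the GCP console in step 2.
keyfile = {
    "type": "service_account",
    "project_id": "my-project",
    "client_email": "airflow@my-project.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
}

# The body stored under airflow/connection/mydag/gcp: a single "extra" key
# whose value embeds the key file as a JSON-encoded string (hence the
# double json.dumps).
secret_body = {
    "extra": json.dumps(
        {"extra__google_cloud_platform__keyfile_dict": json.dumps(keyfile)}
    )
}

print(json.dumps(secret_body, indent=2))
```

For step 5, the backend is configured via `AIRFLOW__SECRETS__BACKEND` set to `airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend`, with the prefix passed in `AIRFLOW__SECRETS__BACKEND_KWARGS`, e.g. `{"connections_prefix": "airflow/connection"}`.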