Storing Google Cloud Platform (GCP) Credentials in AWS Secrets Manager for Apache Airflow to connect to BigQuery
This took me a while to figure out from the code & docs. Most Google searches led to unhelpful articles (usually about storing the credentials in GCP Secret Manager instead).
- Create a service account (or other account)
- Create a new key and download the JSON file
- Open the file & copy the entire JSON into the clipboard
- Create an AWS Secret (name it appropriately; let's say `airflow/connection/mydag/gcp`)
- Set the key to `extra`
- Set the value to: `{ "extra__google_cloud_platform__keyfile_dict": "{ <contents of your key-file> }" }`
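The fiddly part is the escaping: the key-file JSON is embedded as a string inside the `extra` JSON, which is itself the secret's value. A minimal Python sketch of building that nested value (the key-file fields below are placeholders, not a real key):

```python
import json

# Stand-in for the downloaded service-account key file's contents;
# a real key file has more fields than these placeholders.
keyfile_contents = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
})

# The "extra" value: the key-file JSON embedded (escaped) under
# extra__google_cloud_platform__keyfile_dict.
extra_value = json.dumps(
    {"extra__google_cloud_platform__keyfile_dict": keyfile_contents}
)

# Full secret body stored at airflow/connection/mydag/gcp
# (key "extra", value as above).
secret_body = json.dumps({"extra": extra_value})
print(secret_body)
```

Using `json.dumps` at each level takes care of quoting the nested JSON strings so you don't have to escape them by hand.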
- Make sure you configure Airflow's secrets backend to be AWS Secrets Manager
  - You should use a prefix (we're using the prefix `airflow/connection` for connections)
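For reference, that backend configuration typically lives in `airflow.cfg` and looks something like this (the backend class comes from the `apache-airflow-providers-amazon` package; a sketch, adjust to your setup):

```ini
[secrets]
backend = airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
backend_kwargs = {"connections_prefix": "airflow/connection"}
```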
- Now you can use the connection name `mydag/gcp` from your DAG to connect to BigQuery
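A sketch of what that looks like in a DAG, assuming `apache-airflow-providers-google` is installed and the secrets backend is configured as above (the DAG id, schedule, and query are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(dag_id="mydag", start_date=datetime(2023, 1, 1), schedule=None) as dag:
    run_query = BigQueryInsertJobOperator(
        task_id="run_query",
        # Resolved by the secrets backend from the AWS secret
        # airflow/connection/mydag/gcp (prefix stripped).
        gcp_conn_id="mydag/gcp",
        configuration={
            "query": {
                "query": "SELECT 1",
                "useLegacySql": False,
            },
        },
    )
```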