Example: setting up an aggregate log sink for Audit Logs on Google Cloud Platform (GCP), shipping to BigQuery
#!/usr/bin/env bash
#####################################################################
# REFERENCES
# - https://cloud.google.com/logging/docs/export/aggregated_sinks
# - https://cloud.google.com/bigquery/docs/datasets#bq
# - https://cloud.google.com/bigquery/docs/access-control-basic-roles
#####################################################################
export PROJECT_ID=$(gcloud config get-value project)
export PROJECT_USER=$(gcloud config get-value core/account) # set current user
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)")
export IDNS=${PROJECT_ID}.svc.id.goog # workload identity domain
export GCP_REGION="us-central1" # CHANGEME (OPT)
export GCP_ZONE="us-central1-a" # CHANGEME (OPT)
#####################################################################
# FOLDERS (create a folder for experiment)
#####################################################################
export SANDBOX_FOLDER=$FOLDER # created previously (DoiT-specific)
export DEMO_FOLDER_NAME="products"
gcloud resource-manager folders create \
--display-name=$DEMO_FOLDER_NAME \
--folder=$SANDBOX_FOLDER
export DEMO_FOLDER_ID=891980021895 # make note of folder ID after create (folders/<folder-id>)
#####################################################################
# SECURITY PROJECT (where security team access only and logs ship to)
#####################################################################
export BILLING_ACCOUNT_ID=$BILLING # created previously
export SECURITY_PROJECT_ID="example-security"
# security
gcloud projects create $SECURITY_PROJECT_ID \
--folder $DEMO_FOLDER_ID
gcloud beta billing projects link $SECURITY_PROJECT_ID \
--billing-account=$BILLING_ACCOUNT_ID
gcloud services enable compute.googleapis.com \
storage.googleapis.com \
bigquery.googleapis.com \
--project $SECURITY_PROJECT_ID
# disable deletion (key project)
gcloud alpha resource-manager liens create \
--restrictions=resourcemanager.projects.delete \
--reason="Contains audit logs and sensitive data" \
--project $SECURITY_PROJECT_ID
# disable public buckets ( storage.publicAccessPrevention )
gcloud resource-manager org-policies enable-enforce \
--project $SECURITY_PROJECT_ID \
storage.publicAccessPrevention
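# (optional) verify the policy is enforced; a quick check, assuming your
# account can read the effective policy
gcloud resource-manager org-policies describe \
  storage.publicAccessPrevention \
  --project $SECURITY_PROJECT_ID \
  --effective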
#####################################################################
# BIG QUERY LOG SINK (aggregate audit logs for all projects in folder)
#####################################################################
export DATASET_ID="audit_logs"
export SINK_NAME="audit-log-sink"
export LOG_FILTER="protoPayload.@type:\"type.googleapis.com/google.cloud.audit.AuditLog\""
# create bigquery dataset in security project
bq --location=US mk -d \
--description "Audit log sink" \
--project_id $SECURITY_PROJECT_ID \
$DATASET_ID
# create aggregate log sink on demo folder -> bq dataset
gcloud logging sinks create $SINK_NAME \
bigquery.googleapis.com/projects/$SECURITY_PROJECT_ID/datasets/$DATASET_ID \
--include-children \
--folder=$DEMO_FOLDER_ID \
--log-filter="$LOG_FILTER"
# get the sink service account ID to grant editor role to BigQuery
export SINK_SA=$(gcloud logging sinks describe $SINK_NAME --folder $DEMO_FOLDER_ID --format="value(writerIdentity)")
echo "Captured log sink SA: $SINK_SA"
# add IAM role to security project for sink SA
gcloud projects add-iam-policy-binding $SECURITY_PROJECT_ID \
--member=$SINK_SA --role=roles/bigquery.dataEditor
#####################################################################
# EXAMPLE PROJECTS (will ship logs to sink)
#####################################################################
export EXAMPLE_PROJECT_1="mike-example-project-1"
export EXAMPLE_PROJECT_2="mike-example-project-2"
# project 1
gcloud projects create $EXAMPLE_PROJECT_1 \
--folder $DEMO_FOLDER_ID
gcloud beta billing projects link $EXAMPLE_PROJECT_1 \
--billing-account=$BILLING_ACCOUNT_ID
gcloud services enable compute.googleapis.com \
pubsub.googleapis.com \
--project $EXAMPLE_PROJECT_1
# project 2
gcloud projects create $EXAMPLE_PROJECT_2 \
--folder $DEMO_FOLDER_ID
gcloud beta billing projects link $EXAMPLE_PROJECT_2 \
--billing-account=$BILLING_ACCOUNT_ID
gcloud services enable compute.googleapis.com \
storage.googleapis.com \
--project $EXAMPLE_PROJECT_2
# check BQ dataset and confirm entries appear
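# a verification sketch: table names assume the sink's default sharded
# naming (cloudaudit_googleapis_com_activity_YYYYMMDD); adjust as needed
bq ls --project_id $SECURITY_PROJECT_ID $DATASET_ID
bq query --project_id $SECURITY_PROJECT_ID --use_legacy_sql=false \
  "SELECT timestamp, protopayload_auditlog.methodName
   FROM \`${SECURITY_PROJECT_ID}.${DATASET_ID}.cloudaudit_googleapis_com_activity_*\`
   ORDER BY timestamp DESC LIMIT 10"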
#####################################################################
# GENERATE TERRAFORM FOR THE EXAMPLE ABOVE
# - https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_folder_sink
# - https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/logging_organization_sink
#####################################################################
gcloud config set project $SECURITY_PROJECT_ID
gcloud services enable cloudasset.googleapis.com
gcloud beta resource-config bulk-export \
--folder=$DEMO_FOLDER_ID \
--resource-format=terraform > main.tf
# fetch relevant resources from main.tf file and modularize as needed
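# a sketch for locating the relevant resources in the export
grep -En "google_logging_folder_sink|google_bigquery_dataset" main.tf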
mikesparr commented Feb 25, 2022

Overview

For compliance, and just for peace of mind, it's good practice to configure a log sink at the Organization or Folder level (this example uses a Folder) that ships audit logs to another destination for storage and/or analysis. That way, if a bad actor compromises one of your environments, they cannot cover their tracks, and you can remain compliant with various standards organizations.

Demo

This demonstration creates a demo folder named products and three projects:

  • example-security (audit log destination to BigQuery dataset)
  • mike-example-project-1 (simulate some team project)
  • mike-example-project-2 (simulate some team project)

After creating the security project, a BigQuery dataset is created in it to receive the log data from the other projects. Then an aggregate log sink at the folder level (sinks can be set at the Org, Folder, or Project level) collects all audit logs for the folder and any projects inside it.

Result

As you can see in the screenshots below, even as I created the projects they began shipping audit log data to the BigQuery dataset stored in a separate project. I added a lien to prevent project deletion, and I can restrict who gains IAM access to that project for complete control. Furthermore, with BigQuery's granular access controls, you could fine-tune access as desired.
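For example, a minimal sketch of granting a security team read and query access (the group address is hypothetical):

# hypothetical group; swap in your own security team group
gcloud projects add-iam-policy-binding $SECURITY_PROJECT_ID \
    --member=group:security-team@example.com --role=roles/bigquery.dataViewer
gcloud projects add-iam-policy-binding $SECURITY_PROJECT_ID \
    --member=group:security-team@example.com --role=roles/bigquery.jobUser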

[Screenshots: audit log entries arriving in the BigQuery audit_logs dataset in the security project]

mikesparr commented Feb 25, 2022

Terraform config


#####################################################################
# GENERATE TERRAFORM FOR THE EXAMPLE ABOVE
#####################################################################
gcloud config set project $SECURITY_PROJECT_ID
gcloud services enable cloudasset.googleapis.com

gcloud beta resource-config bulk-export \
    --folder=$DEMO_FOLDER_ID \
    --resource-format=terraform > main.tf

This section of code generates Terraform resources in a file main.tf that you can then modularize and incorporate into your own IaC configuration. Because we implemented an aggregate log sink, however, all projects you create under the folder will automatically begin shipping log data to your security project (I usually just name it security and place it outside the others in a folder called shared-services).

Enjoy!!!

Relevant resources

resource "google_bigquery_dataset" "audit_logs" {
  access {
    role          = "OWNER"
    special_group = "projectOwners"
  }
  access {
    role          = "OWNER"
    user_by_email = "you@youremail.com"
  }
  access {
    role          = "READER"
    special_group = "projectReaders"
  }
  access {
    role          = "WRITER"
    special_group = "projectWriters"
  }
  dataset_id                 = "audit_logs"
  delete_contents_on_destroy = false
  description                = "Audit log sink"
  location                   = "US"
  project                    = "example-security"
}

resource "google_logging_folder_sink" "audit-log-sink" {
  name   = "audit-log-sink"
  description = "Audit log sink"
  folder = google_folder.products.name

  # Can export to pubsub, cloud storage, or bigquery
  destination = "bigquery.googleapis.com/projects/example-security/datasets/${google_bigquery_dataset.audit_logs.dataset_id}"

  # Log all audit log types in projects within folder
  filter = "protoPayload.@type:\"type.googleapis.com/google.cloud.audit.AuditLog\""
}

resource "google_project_iam_binding" "bq-data-editor" {
  project = "example-security"
  role = "roles/bigquery.dataEditor"

  members = [
    google_logging_folder_sink.audit-log-sink.writer_identity,
  ]
}
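To exercise the config, the usual Terraform workflow applies (assuming the google provider and credentials are already configured):

terraform init
terraform plan
terraform apply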


Additional tip

If you are shipping audit logs via the sink, you can add an exclusion filter to your other Cloud Logging sinks (e.g. _Default) to ignore audit logs and avoid paying to ingest them twice.
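For example, a sketch using gcloud (the exclusion name is arbitrary; repeat for each project's _Default sink):

gcloud logging sinks update _Default \
    --add-exclusion=name=exclude-audit-logs,filter='protoPayload.@type:"type.googleapis.com/google.cloud.audit.AuditLog"' \
    --project $EXAMPLE_PROJECT_1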


Cleanup

gcloud projects delete $EXAMPLE_PROJECT_1
gcloud projects delete $EXAMPLE_PROJECT_2
gcloud alpha resource-manager liens list # copy the ID
gcloud alpha resource-manager liens delete <paste ID>
gcloud projects delete $SECURITY_PROJECT_ID
gcloud resource-manager folders delete $DEMO_FOLDER_ID
