@polleyg
Created September 6, 2018 11:47
A Cloud Build configuration (cloudbuild.yaml) that provisions GCP infrastructure with Terraform, builds and runs a Dataflow pipeline, deploys a Cloud Function that watches a bucket, and then drops a file in to trigger the whole thing.
steps:
# 1. Fetch the source code
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/polleyg/gcp-batch-ingestion-bigquery.git']
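# Every step runs against the same mounted /workspace volume, so whatever
# this clone fetches is visible to all of the later steps.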
# 2a. Initialise Terraform using the public terraform Docker image
- name: hashicorp/terraform
  args: ['init']
  dir: 'terraform'
# 2b. Create the GCS bucket using Terraform
- name: hashicorp/terraform
  id: terraform-apply
  args: ['apply', '-auto-approve']
  dir: 'terraform'
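# Terraform should pick up Application Default Credentials from the build's
# service account here (assuming the provider config in the repo doesn't
# override them), so no key file needs to be baked into the build.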
# 3. Build and run the Dataflow pipeline (staged template)
- name: gcr.io/cloud-builders/gradle
  args: ['build', 'run']
  waitFor: ['terraform-apply']
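# Steps normally run one after another, but this step and 4a both declare
# waitFor: ['terraform-apply'], so Cloud Build runs them in parallel as soon
# as the Terraform apply has finished.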
# 4a. Install npm & run tests
- name: gcr.io/cloud-builders/npm
  id: npm-install-test
  args: ['install-test']
  dir: 'cloud-function'
  waitFor: ['terraform-apply']
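# 'npm install-test' is npm's built-in alias that runs 'npm install'
# followed by 'npm test' in one go.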
# 4b. Deploy the Cloud Function that listens to the bucket
- name: gcr.io/cloud-builders/gcloud
  id: function-deploy
  args: ['functions', 'deploy', 'goWithTheDataFlow', '--stage-bucket=gs://batch-pipeline', '--trigger-bucket=gs://batch-pipeline']
  dir: 'cloud-function'
  waitFor: ['npm-install-test']
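# --trigger-bucket wires the function up to Cloud Storage object-change
# events, so goWithTheDataFlow fires whenever a file lands in
# gs://batch-pipeline.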
# 5. Trigger the pipeline for demo purposes
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'gs://test-file-for-dataflow/*', 'gs://batch-pipeline/upload/file.csv']
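# Dropping this file into the bucket fires the function deployed in 4b,
# which is what actually kicks off the Dataflow pipeline.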
# 6. Copy tarball/archive to GCS for more shenanigans later
artifacts:
  objects:
    location: 'gs://batch-pipeline/artifacts'
    paths: ['build/distributions/*.*']
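
To kick this config off manually (rather than from a build trigger), something like the following should work from the repo root, assuming the Cloud SDK is installed and pointed at the right project:

gcloud builds submit --config cloudbuild.yaml .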