@mbaitelman
Last active February 20, 2024 22:31
Automated Terraform Deployments Using Bitbucket Pipelines

Automated Terraform Deployments

Requirements

  • AWS Account
  • AWS IAM user with an access key ID and secret access key
  • Bitbucket Account

Setup

  1. Create a new Bitbucket repo or use an existing one
  2. Open https://bitbucket.org/USERNAME/REPO/addon/pipelines/deployments
  3. Set your bitbucket-pipelines.yml with the content of the attached file
  4. Commit file
  5. Open your repo settings > Repository Variables
  6. Create AWS_ACCESS_KEY_ID with your AWS access key ID (mark as secret)
  7. Create AWS_SECRET_ACCESS_KEY with your AWS secret access key (mark as secret)
  8. Create AWS_DEFAULT_REGION with an AWS region (optional if a region is set in your AWS provider file)
  9. Create and commit a file called aws.tf with the content of the attached file
    • For more information, see the Terraform AWS provider documentation
  10. Create an S3 bucket (keep it private and enable versioning)
  11. Create a DynamoDB table (primary key LockID, type String)
  12. Create and commit a file called terraform.tf with the content of the attached file
    • Replace BUCKET-NAME with your bucket name
    • Replace DYNAMODB-NAME with your DynamoDB table name
    • For more information, see the Terraform S3 backend documentation
  13. You are now ready to start deploying
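
Steps 10 and 11 can be sketched with the AWS CLI. This is only a sketch: the bucket name, table name, and region below are placeholders you should replace with your own, and the commands assume your AWS credentials are already configured.

```shell
# Private S3 bucket with versioning enabled, for the Terraform state
# ("my-terraform-state" and "us-west-2" are placeholders)
aws s3api create-bucket --bucket my-terraform-state \
    --region us-west-2 \
    --create-bucket-configuration LocationConstraint=us-west-2
aws s3api put-bucket-versioning --bucket my-terraform-state \
    --versioning-configuration Status=Enabled

# DynamoDB table for state locking; the primary key must be named LockID
# ("my-terraform-locks" is a placeholder)
aws dynamodb create-table --table-name my-terraform-locks \
    --attribute-definitions AttributeName=LockID,AttributeType=S \
    --key-schema AttributeName=LockID,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST
```

New buckets are private by default; the versioning call lets you recover earlier state files if an apply goes wrong.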

Deploying

To use this for deployments:

  • Create a file with Terraform AWS resources
  • Commit it to a development branch
  • The pipeline will validate the code and show a plan of what will be created, updated, or removed
  • If the pipeline passes, merge the changes to the master branch
  • The pipeline will validate and plan again before deploying the new code
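
The branch workflow above might look like this from the command line. The branch name, file name, and bucket name are illustrative, not part of the original setup:

```shell
# Work on a development branch (name is a placeholder)
git checkout -b add-example-bucket

# A minimal Terraform AWS resource to deploy
# (the bucket name is a placeholder and must be globally unique)
cat > example.tf <<'EOF'
resource "aws_s3_bucket" "example" {
  bucket = "my-example-bucket-12345"
}
EOF

git add example.tf
git commit -m "Add example S3 bucket"
git push origin add-example-bucket
# Bitbucket Pipelines now runs init/validate/plan on this branch;
# once it passes, merge to master to trigger the apply step.
```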
aws.tf

provider "aws" {
}
bitbucket-pipelines.yml

image: hashicorp/terraform:full
pipelines:
  default:
    - step:
        script:
          - terraform init
          - terraform validate
          - terraform plan
  branches:
    master:
      - step:
          script:
            - terraform init
            - terraform validate
            - terraform plan
            - terraform apply -input=false -auto-approve
terraform.tf

terraform {
  required_version = ">= 0.9.1"

  backend "s3" {
    # This is an S3 bucket you will need to create in your AWS
    # account
    bucket = "BUCKET-NAME"
    # The key should be unique to each stack; because we want to
    # have multiple environments alongside each other we set
    # this dynamically in the bitbucket-pipelines.yml with the
    # -backend-config flag
    key = "example-01"
    region = "us-west-2"
    # This is a DynamoDB table with the primary key set to LockID
    dynamodb_table = "DYNAMODB-NAME"
    # Enable server-side encryption on your Terraform state
    encrypt = true
  }
}