
@SagnikPradhan
Last active October 27, 2023 12:51
๐Ÿ—„๏ธ Back up a postgres database running inside a docker container to S3
#!/bin/bash
# Back up a Postgres database running in a Docker container and stream the
# compressed dump straight to S3, without writing to local disk.
set -e
set -o pipefail

GENERATION_DATE=$(date -u +"%Y-%m-%dT%H-%M-%SZ")
GENERATED_BACKUP_FILE="backup-$GENERATION_DATE.dump"

# Load variables from a .env file if one is present.
if [ -e .env ]; then
  read -ra env < <(xargs < .env)
  export "${env[@]}"
fi

# Fail fast on missing configuration.
errecho() { echo "$@" >&2; }
[[ -z "$DATABASE_CONTAINER" ]] && { errecho "DATABASE_CONTAINER is empty"; exit 1; }
[[ -z "$DATABASE_URL" ]] && { errecho "DATABASE_URL is empty"; exit 1; }
[[ -z "$AWS_ACCESS_KEY_ID" ]] && { errecho "AWS_ACCESS_KEY_ID is empty"; exit 1; }
[[ -z "$AWS_SECRET_ACCESS_KEY" ]] && { errecho "AWS_SECRET_ACCESS_KEY is empty"; exit 1; }
[[ -z "$AWS_DEFAULT_REGION" ]] && { errecho "AWS_DEFAULT_REGION is empty"; exit 1; }
[[ -z "$AWS_BUCKET_NAME" ]] && { errecho "AWS_BUCKET_NAME is empty"; exit 1; }
[[ -z "$AWS_BACKUP_FOLDER" ]] && { errecho "AWS_BACKUP_FOLDER is empty"; exit 1; }

# Dump in custom format, compress, and pipe to the AWS CLI container for upload.
docker exec "$DATABASE_CONTAINER" pg_dump -Fc --dbname="$DATABASE_URL" |
  gzip --best |
  docker run \
    --rm -i \
    --env AWS_ACCESS_KEY_ID \
    --env AWS_SECRET_ACCESS_KEY \
    --env AWS_DEFAULT_REGION \
    amazon/aws-cli s3 cp - "s3://$AWS_BUCKET_NAME/$AWS_BACKUP_FOLDER/$GENERATED_BACKUP_FILE.gz"
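The upload pipeline above can be reversed for disaster recovery. A minimal restore sketch, assuming the same environment variables are set; the `restore_backup` function name and the `--clean` flag are choices made here, not part of the original gist:

```shell
#!/bin/bash
# Hypothetical companion sketch: restore a backup produced by the script above.
# Takes the S3 object key of the backup (relative to the bucket) as argument.
set -e
set -o pipefail

restore_backup() {
  local key="$1"
  # Stream the compressed dump from S3, decompress it, and feed it to
  # pg_restore running inside the database container.
  docker run \
    --rm \
    --env AWS_ACCESS_KEY_ID \
    --env AWS_SECRET_ACCESS_KEY \
    --env AWS_DEFAULT_REGION \
    amazon/aws-cli s3 cp "s3://$AWS_BUCKET_NAME/$key" - |
    gunzip |
    docker exec -i "$DATABASE_CONTAINER" pg_restore --clean --dbname="$DATABASE_URL"
}

# Example invocation (uncomment to run against a real backup key):
# restore_backup "postgres-backups/backup-2023-10-27T12-00-00Z.dump.gz"
```

`--clean` drops existing objects before recreating them, which suits restoring into the same database; omit it when restoring into a fresh one.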
SagnikPradhan commented Oct 27, 2023

You might want to use this together with a crontab entry and S3 expiration policies. Also, don't forget to put the required environment variables in an .env file or pass them in explicitly.
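For reference, a hypothetical .env file and crontab entry might look like this; every value below is a placeholder, and `backup.sh` is an assumed name for the script above:

```
# .env — consumed by the backup script via xargs
DATABASE_CONTAINER=my-postgres
DATABASE_URL=postgres://user:password@localhost:5432/mydb
AWS_ACCESS_KEY_ID=AKIA-PLACEHOLDER
AWS_SECRET_ACCESS_KEY=secret-placeholder
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET_NAME=my-backup-bucket
AWS_BACKUP_FOLDER=postgres-backups
```

```
# crontab entry: run the backup daily at 02:30, logging output
30 2 * * * cd /opt/backup && ./backup.sh >> /var/log/pg-backup.log 2>&1
```

Note that the .env values are split on whitespace by `xargs`, so values containing spaces would need additional quoting handling.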
