@Diewy
Created December 29, 2023 15:00
Simple backup script for PostgreSQL that syncs the backup to S3. In this example it is used to back up a Mastodon server.
#!/bin/bash
set -euo pipefail
# Replace the placeholders with your actual database credentials.
# AWS should be authenticated via the aws cli command `aws configure`.
DATABASE_USER=mastodon
DATABASE_PASSWORD=
DATABASE_HOST=/var/run/postgresql
DATABASE_PORT=5432
DATABASE_NAME=mastodon_production
S3_BUCKET=mybucket
BACKUP_DIR=/home/mastodon/backups
# Create the backup directory if it doesn't exist
mkdir -p "$BACKUP_DIR"
# Generate the current timestamp
TIMESTAMP=$(date +"%Y%m%d%H%M%S")
# Perform the backup using pg_dump.
# Note: pg_dump's -W flag takes no argument (it only forces a password prompt);
# pass the password via the PGPASSWORD environment variable instead.
PGPASSWORD="$DATABASE_PASSWORD" pg_dump -U "$DATABASE_USER" -h "$DATABASE_HOST" -p "$DATABASE_PORT" "$DATABASE_NAME" > "$BACKUP_DIR/$DATABASE_NAME-$TIMESTAMP.sql"
# Upload the backup file to the S3 bucket
aws s3 cp "$BACKUP_DIR/$DATABASE_NAME-$TIMESTAMP.sql" "s3://$S3_BUCKET/"
# Delete old local backups, keeping only the 20 most recent
# (xargs -r skips the rm entirely when there is nothing to delete)
ls -t "$BACKUP_DIR"/*.sql | tail -n +21 | xargs -r rm --
# Keep only the 20 most recent backups in the S3 bucket.
# Sort by LastModified in descending order so that everything after the
# first 20 entries is the oldest; sorting ascending would delete the newest.
aws s3api list-objects --bucket "$S3_BUCKET" --query 'Contents[?Key!=`null`].[Key,LastModified]' --output text | sort -k2 -r | awk 'NR>20 {print $1}' | xargs -r -I {} aws s3 rm "s3://$S3_BUCKET/{}"
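To run this on a schedule, the script can be invoked from cron. A minimal sketch, assuming the script is saved as /home/mastodon/backup.sh and made executable with `chmod +x` (the path and schedule are assumptions, not part of the original gist):

```
# crontab -e as the mastodon user:
# run the backup daily at 04:00, appending output to a log file
0 4 * * * /home/mastodon/backup.sh >> /home/mastodon/backups/backup.log 2>&1
```

Because `aws configure` stores credentials per user, the cron job should run as the same user that was authenticated with the AWS CLI.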