
@iGhost
Last active August 24, 2022 10:20
PostgreSQL multi-database backup script with upload to AWS S3. Compression uses the fast `zstd` (Zstandard) tool. The script dumps every database available on the server into a separate archive file and uploads each one to the specified AWS S3 bucket.
#!/bin/bash
# Installation
#
# from `ubuntu` user:
#
# * install the Zstandard compression utility (roughly 3x faster than gzip): `sudo apt install zstd`
#
# * install `awscli`: `pip install awscli --upgrade --user`
#
# * create `~/my-aws-config` file (0600) with AWS credentials
# [default]
# aws_access_key_id = XXXXXXXXXX
# aws_secret_access_key = YYYYYYYYYYYYY
#
# or specify credentials in default location (~/.aws/credentials)
#
# * create `~/backup` folder
#
# in the script below, set:
# STAGE (staging|production|etc) - environment name; a folder with this name is created in the bucket
# BUCKET - AWS S3 bucket to store the archives in
#
# * create cron job:
# `crontab -e`
# 0 8 * * * /home/ubuntu/backup.sh >> /home/ubuntu/cron.log 2>&1
export AWS_CONFIG_FILE="$HOME/my-aws-config" # "~" is not expanded inside quotes, so use $HOME
STAGE='production'
BUCKET='my-backups'
echo "========================================================================="
echo "`date '+%Y-%m-%d %H:%M:%S'`: PostgreSQL databases backup started";
echo "";
DBLIST=`sudo -u postgres psql -q -t -A -c "SELECT datname FROM pg_database WHERE NOT datistemplate AND datname <> 'postgres'"`; # -A strips the padding psql adds in aligned mode
for d in $DBLIST
do
DATE=`date '+%Y-%m-%d-%H-%M-%S'`;
SQLFILE="$d-$DATE.sql";
ZIPFILE="$HOME/backup/$SQLFILE.zst";
echo "`date '+%Y-%m-%d %H:%M:%S'`: Dumping $d to $ZIPFILE";
# --rm has no effect when zstd reads from a pipe; quote paths to be safe
sudo -u postgres pg_dump "$d" | zstd > "$ZIPFILE";
echo "`date '+%Y-%m-%d %H:%M:%S'`: Uploading to S3";
/home/ubuntu/.local/bin/aws s3 cp "$ZIPFILE" "s3://$BUCKET/$STAGE/$d/"
echo "";
done
echo "`date '+%Y-%m-%d %H:%M:%S'`: Cleaning up folder";
find ~/backup -mtime +10 -name "*.zst" -exec rm {} \; # the archives end in .zst, not .gz
echo "`date '+%Y-%m-%d %H:%M:%S'`: Backup finished";
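For completeness, restoring one of these archives is the reverse pipeline: fetch from S3, decompress with `zstd -d`, and feed the SQL to `psql`. A minimal sketch with hypothetical bucket, database, and timestamp values (the `aws` and `psql` calls are shown as comments, since they assume live credentials and a running server):

```shell
#!/bin/bash
# Hypothetical example values - substitute your own bucket, stage, database and timestamp
BUCKET='my-backups'
STAGE='production'
DB='mydb'
DATE='2022-08-24-08-00-00'
KEY="s3://$BUCKET/$STAGE/$DB/$DB-$DATE.sql.zst"

# List the available archives for this database:
#   aws s3 ls "s3://$BUCKET/$STAGE/$DB/"
# Stream the archive from S3, decompress, and feed it to psql:
#   aws s3 cp "$KEY" - | zstd -d | sudo -u postgres psql "$DB"
echo "$KEY"
```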
@catermelon

thank you! i swiped and edited it for my own backup <3

@mamged

mamged commented Dec 16, 2021

you didn't mention the final command with the required arguments

@iGhost

iGhost commented Dec 16, 2021

@mamged Line 26 shows how to run it via crontab.
But you can also just run it as is: bash backup.sh, or ./backup.sh once you make it executable with chmod +x backup.sh

@PenzinAlexander

PenzinAlexander commented Aug 24, 2022

Hello. I was getting database names with $'\r' at the end.
So here's my solution for this:

#!/bin/bash
#set -x
function dump () {
  # -t would add carriage returns to the dump output itself, so only -i is used here
  docker exec -i postgresql pg_dump -h 127.0.0.1 -U postgres -Fc -C "$db" > "/backup/$db.sql"
}

DBLIST2=$(docker exec -it postgresql psql -U postgres -tc "SELECT datname from pg_database WHERE datistemplate = false AND datname <> 'postgres';")
DBLIST=$(echo "$DBLIST2" | sed 's/\r//g') # deleting those carriage return symbols
for db in $DBLIST
do
  echo "Dumping $db.."
  dump
done
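As a side note, the carriage returns come from docker exec's `-t` flag, which allocates a pseudo-TTY that converts `\n` into `\r\n`; dropping `-t` (and `-i`, which is only needed when piping into stdin) avoids them at the source. A small sketch of both approaches (the docker command is shown as a comment, since it assumes a running `postgresql` container):

```shell
#!/bin/bash
# Without a pseudo-TTY there are no carriage returns to strip:
#   DBLIST=$(docker exec postgresql psql -U postgres -tAc \
#     "SELECT datname FROM pg_database WHERE NOT datistemplate AND datname <> 'postgres'")
# If the output does contain them (e.g. captured with -t), tr removes every CR:
raw=$'db1\r\ndb2\r'
clean=$(printf '%s' "$raw" | tr -d '\r')
echo "$clean"
```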
