Auto PostgreSQL backup script.
#!/bin/bash
#
# PostgreSQL Backup Script Ver 1.0
# http://autopgsqlbackup.frozenpc.net
# Copyright (c) 2005 Aaron Axelsen <axelseaa@amadmax.com>
#
# This script is based on the AutoMySQLBackup Script Ver 2.2
# It can be found at http://sourceforge.net/projects/automysqlbackup/
#
# The PostgreSQL changes are based on a patch against AutoMySQLBackup 1.9
# created by Friedrich Lobenstock <fl@fl.priv.at>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
#
#=====================================================================
# Set the following variables to your system needs
# (Detailed instructions below variables)
#=====================================================================
# Username to access the PostgreSQL server e.g. dbuser
USERNAME=postgres
# Password
# create a file $HOME/.pgpass containing a line like this
# hostname:*:*:dbuser:dbpass
# replace hostname with the value of DBHOST and dbuser with
# the value of USERNAME
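#
# For example (hypothetical values; substitute your own DBHOST,
# USERNAME and password):
#     echo 'localhost:*:*:postgres:mysecret' > $HOME/.pgpass
#     chmod 600 $HOME/.pgpass
# libpq ignores .pgpass unless it is readable by its owner only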
# Host name (or IP address) of PostgreSQL server e.g localhost
DBHOST=localhost
# List of DBNAMES for Daily/Weekly Backup e.g. "DB1 DB2 DB3"
DBNAMES="all"
# Backup directory location e.g /backups
BACKUPDIR="/var/backups/postgres"
# Mail setup
# What would you like to be mailed to you?
# - log : send only log file
# - files : send log file and sql files as attachments (see docs)
# - stdout : will simply output the log to the screen if run manually.
MAILCONTENT="log"
# Set the maximum allowed email size in k. (4000 = approx 5MB email [see docs])
MAXATTSIZE="4000"
# Email Address to send mail to? (user@domain.com)
MAILADDR="root@localhost"
# ============================================================
# === ADVANCED OPTIONS ( Read the docs below for details ) ===
#=============================================================
# List of DBNAMES for Monthly Backups.
MDBNAMES="template1 $DBNAMES"
# List of DBNAMES to EXCLUDE if DBNAMES is set to all (must be in " quotes)
DBEXCLUDE=""
# Include CREATE DATABASE in backup?
CREATE_DATABASE=yes
# Separate backup directory and file for each DB? (yes or no)
SEPDIR=yes
# Which day do you want weekly backups? (1 to 7 where 1 is Monday)
DOWEEKLY=6
# Choose Compression type. (gzip or bzip2)
COMP=bzip2
# Command to run before backups (uncomment to use)
#PREBACKUP="/etc/pgsql-backup-pre"
# Command run after backups (uncomment to use)
#POSTBACKUP="bash /home/backups/scripts/ftp_pgsql"
#=====================================================================
# Options documentation
#=====================================================================
# Set USERNAME and PASSWORD of a user that has at least SELECT permission
# to ALL databases.
#
# Set the DBHOST option to the server you wish to backup, leave the
# default to backup "this server".(to backup multiple servers make
# copies of this file and set the options for that server)
#
# Put in the list of DBNAMES (databases) to be backed up. If you would like
# to backup ALL DBs on the server set DBNAMES="all".(if set to "all" then
# any new DBs will automatically be backed up without needing to modify
# this backup script when a new DB is created).
#
# If the DB you want to backup has a space in the name replace the space
# with a % e.g. "data base" will become "data%base"
# NOTE: Spaces in DB names may not work correctly when SEPDIR=no.
#
# You can change the backup storage location from /backups to anything
# you like by using the BACKUPDIR setting..
#
# The MAILCONTENT and MAILADDR options are pretty self-explanatory; use
# these to have the backup log mailed to you at any email address or multiple
# email addresses in a space separated list.
# (If you set mail content to "log" you will require access to the "mail" program
# on your server. If you set this to "files" you will have to have mutt installed
# on your server. If you set it to stdout it will log to the screen if run from
# the console or to the cron job owner if run through cron)
#
# MAXATTSIZE sets the largest allowed email attachments total (all backup files) you
# want the script to send. This is the size before it is encoded to be sent as an email
# so if your mail server will allow a maximum mail size of 5MB I would suggest setting
# MAXATTSIZE to be 25% smaller than that so a setting of 4000 would probably be fine.
#
# Finally copy autopgsqlbackup.sh to anywhere on your server and make sure
# to set executable permission. You can also copy the script to
# /etc/cron.daily to have it execute automatically every night, or simply
# place a symlink in /etc/cron.daily to the file if you wish to keep it
# somewhere else.
# NOTE: On Debian, copy the file with no extension for it to be run
# by cron, e.g. just name the file "autopgsqlbackup"
#
# That's it..
#
#
# === Advanced options doc's ===
#
# The list of MDBNAMES is the DBs to be backed up only monthly. You should
# always include "template1" in this list (as the default above does),
# along with any other DBs that you feel only need to
# be backed up monthly. (if using a hosted server then your provider may
# already be backing these up)
# NOTE: If DBNAMES="all" then MDBNAMES has no effect as all DBs will be backed
# up anyway.
#
# If you set DBNAMES="all" you can configure the option DBEXCLUDE. Otherwise
# this option will not be used.
# This option can be used if you want to backup all DBs, but want to
# exclude some of them (e.g. a DB that is too big).
#
# Set CREATE_DATABASE to "yes" (the default) if you want your SQL-Dump to create
# a database with the same name as the original database when restoring.
# Saying "no" here will allow you to specify the database name you want to
# restore your dump into, making a copy of the database by using the dump
# created with autopgsqlbackup.
# NOTE: Not used if SEPDIR=no
#
# The SEPDIR option allows you to choose to have all DBs backed up to
# a single file (fast restore of entire server in case of crash) or to
# separate directories for each DB (each DB can be restored separately
# in case of single DB corruption or loss).
#
# To set the day of the week that you would like the weekly backup to happen,
# set the DOWEEKLY setting; this can be a value from 1 to 7 where 1 is Monday.
# The default is 6 which means that weekly backups are done on a Saturday.
#
# COMP is used to choose the compression used; options are gzip or bzip2.
# bzip2 will produce slightly smaller files but is more processor intensive so
# may take longer to complete.
#
# Use PREBACKUP and POSTBACKUP to specify pre- and post-backup commands
# or scripts to perform tasks either before or after the backup process.
#
#
#=====================================================================
# Backup Rotation..
#=====================================================================
#
# Daily Backups are rotated weekly..
# Weekly Backups are run by default on Saturday Morning when
# cron.daily scripts are run...Can be changed with DOWEEKLY setting..
# Weekly Backups are rotated on a 5 week cycle..
# Monthly Backups are run on the 1st of the month..
# Monthly Backups are NOT rotated automatically...
# It may be a good idea to copy Monthly backups offline or to another
# server..
#
#=====================================================================
# Please Note!!
#=====================================================================
#
# I take no responsibility for any data loss or corruption when using
# this script..
# This script will not help in the event of a hard drive crash if a
# copy of the backup has not been stored offline or on another PC.
# You should copy your backups offline regularly for best protection.
#
# Happy backing up...
#
#=====================================================================
# Change Log
#=====================================================================
#
# VER 1.0 - (2005-03-25)
# Initial Release - based on AutoMySQLBackup 2.2
#
#=====================================================================
#=====================================================================
#
# Should not need to be modified from here down!!
#
#=====================================================================
#=====================================================================
PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/postgres/bin:/usr/local/pgsql/bin
DATE=`date +%Y-%m-%d` # Datestamp e.g. 2002-09-21
DOW=`date +%A` # Day of the week e.g. Monday
DNOW=`date +%u` # Day number of the week 1 to 7 where 1 represents Monday
DOM=`date +%d` # Date of the Month e.g. 27
M=`date +%B` # Month e.g. January
W=`date +%V` # Week Number e.g. 37
VER=1.0 # Version Number
LOGFILE=$BACKUPDIR/$DBHOST-`date +%N`.log # Logfile Name
OPT="" # OPT string for use with pg_dump ( see man pg_dump )
BACKUPFILES="" # thh: added for later mailing
# Create required directories
if [ ! -e "$BACKUPDIR" ] # Check Backup Directory exists.
then
mkdir -p "$BACKUPDIR"
fi
if [ ! -e "$BACKUPDIR/daily" ] # Check Daily Directory exists.
then
mkdir -p "$BACKUPDIR/daily"
fi
if [ ! -e "$BACKUPDIR/weekly" ] # Check Weekly Directory exists.
then
mkdir -p "$BACKUPDIR/weekly"
fi
if [ ! -e "$BACKUPDIR/monthly" ] # Check Monthly Directory exists.
then
mkdir -p "$BACKUPDIR/monthly"
fi
# IO redirection for logging.
touch $LOGFILE
exec 6>&1 # Link file descriptor #6 with stdout.
# Saves stdout.
exec > $LOGFILE # stdout replaced with file $LOGFILE.
# Functions
# Database dump function
dbdump () {
pg_dump --username="$USERNAME" $HOST $OPT "$1" > "$2"
return 0
}
# Compression function
SUFFIX=""
compression () {
if [ "$COMP" = "gzip" ]; then
gzip -f "$1"
echo
echo Backup Information for "$1"
gzip -l "$1.gz"
SUFFIX=".gz"
elif [ "$COMP" = "bzip2" ]; then
echo Compression information for "$1.bz2"
bzip2 -f -v $1 2>&1
SUFFIX=".bz2"
else
echo "No compression option set, check advanced settings"
fi
return 0
}
# Run command before we begin
if [ "$PREBACKUP" ]
then
echo ======================================================================
echo "Prebackup command output."
echo
eval $PREBACKUP
echo
echo ======================================================================
echo
fi
if [ "$SEPDIR" = "yes" ]; then # Check if CREATE DATABASE should be included in Dump
if [ "$CREATE_DATABASE" = "no" ]; then
OPT="$OPT"
else
OPT="$OPT --create"
fi
else
OPT="$OPT"
fi
# Hostname for LOG information
if [ "$DBHOST" = "localhost" ]; then
DBHOST="`hostname -f`"
HOST=""
else
HOST="-h $DBHOST"
fi
# If backing up all DBs on the server
if [ "$DBNAMES" = "all" ]; then
DBNAMES="`psql -U $USERNAME $HOST -l -A -F: | sed -ne "/:/ { /Name:Owner/d; /template0/d; s/:.*$//; p }"`"
# If DBs are excluded
for exclude in $DBEXCLUDE
do
DBNAMES=`echo $DBNAMES | sed "s/\b$exclude\b//g"`
done
MDBNAMES=$DBNAMES
fi
echo ======================================================================
echo AutoPostgreSQLBackup VER $VER
echo http://autopgsqlbackup.frozenpc.net/
echo
echo Backup of Database Server - $DBHOST
echo ======================================================================
# Test if separate DB backups are required
if [ "$SEPDIR" = "yes" ]; then
echo Backup Start Time `date`
echo ======================================================================
# Monthly Full Backup of all Databases
if [ $DOM = "01" ]; then
for MDB in $MDBNAMES
do
# Prepare $DB for using
MDB="`echo $MDB | sed 's/%/ /g'`"
if [ ! -e "$BACKUPDIR/monthly/$MDB" ] # Check Monthly DB Directory exists.
then
mkdir -p "$BACKUPDIR/monthly/$MDB"
fi
echo Monthly Backup of $MDB...
dbdump "$MDB" "$BACKUPDIR/monthly/$MDB/${MDB}_$DATE.$M.$MDB.sql"
compression "$BACKUPDIR/monthly/$MDB/${MDB}_$DATE.$M.$MDB.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/monthly/$MDB/${MDB}_$DATE.$M.$MDB.sql$SUFFIX"
echo ----------------------------------------------------------------------
done
fi
for DB in $DBNAMES
do
# Prepare $DB for using
DB="`echo $DB | sed 's/%/ /g'`"
# Create Separate directory for each DB
if [ ! -e "$BACKUPDIR/daily/$DB" ] # Check Daily DB Directory exists.
then
mkdir -p "$BACKUPDIR/daily/$DB"
fi
if [ ! -e "$BACKUPDIR/weekly/$DB" ] # Check Weekly DB Directory exists.
then
mkdir -p "$BACKUPDIR/weekly/$DB"
fi
# Weekly Backup
if [ $DNOW = $DOWEEKLY ]; then
echo Weekly Backup of Database \( $DB \)
echo Rotating 5 weeks Backups...
if [ "$W" -le 05 ];then
REMW=`expr 48 + $W`
elif [ "$W" -lt 15 ];then
REMW=0`expr $W - 5`
else
REMW=`expr $W - 5`
fi
eval rm -fv "$BACKUPDIR/weekly/$DB/${DB}_week.$REMW.*"
echo
dbdump "$DB" "$BACKUPDIR/weekly/$DB/${DB}_week.$W.$DATE.sql"
compression "$BACKUPDIR/weekly/$DB/${DB}_week.$W.$DATE.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/weekly/$DB/${DB}_week.$W.$DATE.sql$SUFFIX"
echo ----------------------------------------------------------------------
# Daily Backup
else
echo Daily Backup of Database \( $DB \)
echo Rotating last weeks Backup...
eval rm -fv "$BACKUPDIR/daily/$DB/*.$DOW.sql.*"
echo
dbdump "$DB" "$BACKUPDIR/daily/$DB/${DB}_$DATE.$DOW.sql"
compression "$BACKUPDIR/daily/$DB/${DB}_$DATE.$DOW.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/daily/$DB/${DB}_$DATE.$DOW.sql$SUFFIX"
echo ----------------------------------------------------------------------
fi
done
echo Backup End `date`
echo ======================================================================
else # One backup file for all DBs
echo Backup Start `date`
echo ======================================================================
# Monthly Full Backup of all Databases
if [ $DOM = "01" ]; then
echo Monthly full Backup of \( $MDBNAMES \)...
dbdump "$MDBNAMES" "$BACKUPDIR/monthly/$DATE.$M.all-databases.sql"
compression "$BACKUPDIR/monthly/$DATE.$M.all-databases.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/monthly/$DATE.$M.all-databases.sql$SUFFIX"
echo ----------------------------------------------------------------------
fi
# Weekly Backup
if [ $DNOW = $DOWEEKLY ]; then
echo Weekly Backup of Databases \( $DBNAMES \)
echo
echo Rotating 5 weeks Backups...
if [ "$W" -le 05 ];then
REMW=`expr 48 + $W`
elif [ "$W" -lt 15 ];then
REMW=0`expr $W - 5`
else
REMW=`expr $W - 5`
fi
eval rm -fv "$BACKUPDIR/weekly/week.$REMW.*"
echo
dbdump "$DBNAMES" "$BACKUPDIR/weekly/week.$W.$DATE.sql"
compression "$BACKUPDIR/weekly/week.$W.$DATE.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/weekly/week.$W.$DATE.sql$SUFFIX"
echo ----------------------------------------------------------------------
# Daily Backup
else
echo Daily Backup of Databases \( $DBNAMES \)
echo
echo Rotating last weeks Backup...
eval rm -fv "$BACKUPDIR/daily/*.$DOW.sql.*"
echo
dbdump "$DBNAMES" "$BACKUPDIR/daily/$DATE.$DOW.sql"
compression "$BACKUPDIR/daily/$DATE.$DOW.sql"
BACKUPFILES="$BACKUPFILES $BACKUPDIR/daily/$DATE.$DOW.sql$SUFFIX"
echo ----------------------------------------------------------------------
fi
echo Backup End Time `date`
echo ======================================================================
fi
echo Total disk space used for backup storage..
echo Size - Location
echo `du -hs "$BACKUPDIR"`
echo
# Run command when we're done
if [ "$POSTBACKUP" ]
then
echo ======================================================================
echo "Postbackup command output."
echo
eval $POSTBACKUP
echo
echo ======================================================================
fi
#Clean up IO redirection
exec 1>&6 6>&- # Restore stdout and close file descriptor #6.
if [ "$MAILCONTENT" = "files" ]
then
#Get backup size
ATTSIZE=`du -c $BACKUPFILES | grep "[[:digit:][:space:]]total$" |sed s/\s*total//`
if [ $MAXATTSIZE -ge $ATTSIZE ]
then
BACKUPFILES=`echo "$BACKUPFILES" | sed -e "s# # -a #g"` #enable multiple attachments
mutt -s "PostgreSQL Backup Log and SQL Files for $DBHOST - $DATE" $BACKUPFILES $MAILADDR < $LOGFILE #send via mutt
else
cat "$LOGFILE" | mail -s "WARNING! - PostgreSQL Backup exceeds set maximum attachment size on $DBHOST - $DATE" $MAILADDR
fi
elif [ "$MAILCONTENT" = "log" ]
then
cat "$LOGFILE" | mail -s "PostgreSQL Backup Log for $DBHOST - $DATE" $MAILADDR
else
cat "$LOGFILE"
fi
# Clean up Logfile
eval rm -f "$LOGFILE"
exit 0
@guapolo guapolo commented Oct 18, 2013

Nice work. Very useful :)

@tejastank tejastank commented Sep 5, 2014

If I want to keep only the latest backup and not the others, how do I do that?

@p1gmale0n p1gmale0n commented Nov 18, 2014

Weekly (separated) backups are not deleted.

@@ -384,7 +384,7 @@ echo ======================================================================
                        else
                                REMW=`expr $W - 5`
                        fi
-               eval rm -fv "$BACKUPDIR/weekly/$DB/${DB}_week.$REMW.*"
+               eval rm -fv "$BACKUPDIR/weekly/$DB/week.$REMW.*"
                echo
                        dbdump "$DB" "$BACKUPDIR/weekly/$DB/${DB}_week.$W.$DATE.sql"
                        compression "$BACKUPDIR/weekly/$DB/${DB}_week.$W.$DATE.sql"
@ruurd ruurd commented Feb 28, 2015

I adjusted the retrieval of DBNAMES because sed played tricks on OS X:

DBNAMES=`psql postgres -t -c "select datname from pg_database;"`
@atularvind atularvind commented Nov 26, 2015

Nice Script.

@jeroenp jeroenp commented Dec 3, 2015

psql -t -P format=unaligned -c "select datname from pg_database" postgres

obviates the use of sed.

@pascalandy pascalandy commented Apr 25, 2016

@atularvind , @jeroenp

Do you guys have a final script to share with your updates?

Thank you!

@MrOlek MrOlek commented May 5, 2016

Nice script, but for me it is much easier to use this free tool: http://postgresql-backup.com/

@skolesnyk skolesnyk commented May 5, 2016

@MrOlek, you're pushing windows software, doesn't help with linux :)

@kyeljmd kyeljmd commented Jun 2, 2016

How do I execute/install this script on my server? do I simply just run this?

@wolfieorama wolfieorama commented Jun 7, 2016

Awesome, Brilliant and Neat ... does the job ;) will blog about it

@wolfieorama wolfieorama commented Jun 7, 2016

@kyeljmd yes, just copy it to your server, change the necessary lines (users, dbnames, etc.), then run it like a normal bash script file. This link should explain it: https://help.ubuntu.com/community/Beginners/BashScripting#Intro

@abhayshukla01 abhayshukla01 commented Jun 15, 2016

Hi, I need an automated script to back up one particular database, not all databases. Can someone help me with the script?

@deepakdeore2004 deepakdeore2004 commented Jun 19, 2016

@abhayshukla01
Set DBNAMES variable to the specific DB you want.

# List of DBNAMES for Daily/Weekly Backup e.g. "DB1 DB2 DB3"
DBNAMES="all"

eg.

DBNAMES="myApp1" # for single DB
DBNAMES="myApp1 myApp2 myApp3" # for multiple DBs
@ameerdevs ameerdevs commented Aug 3, 2016

pg_dump --username=$USERNAME $HOST $OPT $1 > $2
return 0

This can't be safe; some error checking could probably be added.
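Agreed. As a sketch (my own, not part of the gist), the dump function could check pg_dump's exit status, send its stderr to the script's log, and avoid leaving a truncated file behind on failure:

```shell
# Sketch of dbdump with error checking; USERNAME/HOST/OPT/LOGFILE are the
# variables the script already defines.
dbdump () {
	if ! pg_dump --username="$USERNAME" $HOST $OPT "$1" > "$2" 2>>"$LOGFILE"; then
		echo "ERROR: pg_dump of database '$1' failed" >&2
		rm -f "$2"	# do not keep a truncated dump
		return 1
	fi
	return 0
}
```

The callers could then count failures and change the mail subject when any dump failed.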

@rxt360 rxt360 commented Jan 12, 2017

Thanks for the script. Options to exclude daily, weekly and monthly backup would be nice, like DAILY=yes/no, and yes should be the default of course.

@gh67uyyghj gh67uyyghj commented Feb 1, 2017

Gitlab brought me here

@Aleyasen Aleyasen commented Feb 26, 2017

@matthewlehner thanks for the nice script. Just a question, the dbpass in $HOME/.pgpass should be the encrypted password or the plain password?

@tarunsinghaldotme tarunsinghaldotme commented Aug 21, 2017

Hi @matthewlehner,
First of all, thanks for sharing such a useful script.
I have an issue with it, though: it works correctly when I run it locally, but when I run it against a remote pg server it only creates an empty file without any data.
Secondly, there is no option to specify a port number, which causes trouble if you have configured your pg server on a different port.

@wdmeeste1 wdmeeste1 commented Sep 5, 2017

Nice script, but a host port option should be added!
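For what it's worth, a minimal sketch of port support (my own addition; variable name and host are made up): add a DBPORT setting next to DBHOST and fold it into the HOST string that dbdump and psql already receive:

```shell
DBHOST=db.example.com	# hypothetical remote server
DBPORT=5433		# non-default port
# Mirrors the script's existing "Hostname for LOG information" block:
if [ "$DBHOST" = "localhost" ]; then
	HOST=""
else
	HOST="-h $DBHOST -p $DBPORT"
fi
echo "$HOST"	# passed through unquoted to pg_dump/psql, as the script does
```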

@victor-lund victor-lund commented Sep 12, 2017

I doubt this script is fail-safe to run on production servers. @BracketDevs had some valid concerns. In addition to that, the script does not seem to be maintained.

However, I can't seem to find any solid alternatives out there (which is really strange). A question about running this as a cron job: does anyone know how to avoid password prompting when running the script? I'm getting prompted for a password for every database I have, which messes up cron jobs.

@alysson-azevedo alysson-azevedo commented Sep 12, 2017

@victor-lund use a local Unix-socket connection to allow your root user to connect as postgres without a password.

@ajmas ajmas commented Sep 21, 2017

One enhancement could be support for peer authentication?

Edit: I made some changes in my own version, that appear to work for me (consider not fully tested). It assumes root is initiating the job, due to use of sudo.

In advanced options section:

# Whether to use peer authentication
PEER_AUTH=yes

In pg_dump function:

if [ "$PEER_AUTH" = "yes" ]; then
       sudo -u $USERNAME pg_dump $OPT $1 > $2
else
       pg_dump --username=$USERNAME $HOST $OPT $1 > $2
fi

In the section for 'all' databases:

 if [ "$PEER_AUTH" = "yes" ] ; then
          DBNAMES="`sudo -u $USERNAME psql -l -A -F: | sed -ne "/:/ { /Name:Owner/d; /template0/d; s/:.*$//; p }"`"
 else
          DBNAMES="`psql -U $USERNAME $HOST -l -A -F: | sed -ne "/:/ { /Name:Owner/d; /template0/d; s/:.*$//; p }"`"
 fi
@maheshbabuadapa maheshbabuadapa commented Jan 5, 2018

Hello,
Is there any way to redirect log files to the local system?

@Arti3DPlayer Arti3DPlayer commented Mar 27, 2018

WARNING: password file "/home/myuser/.pgpass" has group or world access; permissions should be u=rw (0600) or less
psql: FATAL:  Peer authentication failed for user "myuser"

I created a .pgpass file with the db name and password and set this script up in cron
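The first warning means libpq is ignoring your .pgpass until only its owner can read it; tightening the permissions fixes that part (the peer-authentication failure is a separate pg_hba.conf matter):

```shell
touch "$HOME/.pgpass"		# no-op if the file already exists
chmod 600 "$HOME/.pgpass"	# u=rw only; libpq then stops ignoring the file
```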

@mr-katsini mr-katsini commented May 22, 2018

This is awesome! Love it! Is there any way to set the number of backups to keep?
Like:
MAX_WEEKLY=3 where this would store the latest three weekly database backups per DB backed up
MAX_DAILY=3 where this would store the latest three daily database backups per DB backed up
MAX_MONTHLY=3 where this would store the latest three monthly database backups per DB backed up
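Not built in, but a rough sketch of how such a cap could be bolted on (the variable and function names are mine, not the script's): keep only the newest N files in a backup directory, deleting the rest by modification time:

```shell
MAX_WEEKLY=3
# prune <dir> <max>: delete all but the <max> most recently modified files.
# (Relies on ls -t ordering; fine for this script's dated filenames.)
prune () {
	ls -1t "$1" 2>/dev/null | tail -n +"$(( $2 + 1 ))" | while read -r f; do
		rm -f "$1/$f"
	done
}
# e.g. after the weekly dump:  prune "$BACKUPDIR/weekly/mydb" "$MAX_WEEKLY"
```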

@IgorCova IgorCova commented Jul 18, 2018

Great script. But I don't understand how to restore a DB from the created backup. Will that be another script?
And how can I check that the backup is healthy?
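For what it's worth, a sketch (file and database names below are made up): the dumps are plain SQL, so restoring means piping them back into psql, and `bzip2 -t` gives a cheap integrity check. The demo uses a stand-in file so it runs without a server:

```shell
# Stand-in dump so the demo needs no server; in real use this would be a
# file under $BACKUPDIR, e.g. daily/mydb/mydb_2018-07-18.Wednesday.sql.bz2
echo 'SELECT 1;' | bzip2 > /tmp/demo_dump.sql.bz2
DUMP=/tmp/demo_dump.sql.bz2

bzip2 -t "$DUMP" && echo "archive OK"	# nonzero exit if the archive is corrupt
bunzip2 -c "$DUMP"			# real restore: bunzip2 -c "$DUMP" | psql -U postgres -d postgres
```

With CREATE_DATABASE=yes the dump contains its own CREATE DATABASE statement, so you restore through a maintenance database; a true health check still means restoring into a scratch server and inspecting the data.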

@hengkypl hengkypl commented Aug 28, 2018

Thank you for sharing this useful script. Is it possible to back up from multiple hosts with this script?

@suganya-sangith suganya-sangith commented Sep 19, 2018

Thank you for sharing the script. Is there any way to exclude some tables from the database while doing the backup?

For example, I can exclude tables in mysqlbackup using the following config:
CONFIG_table_exclude=( database.table_name )

@shrirambalakrishnan shrirambalakrishnan commented Oct 17, 2018

@suganya-sangith There is no separate config variable available for this.
The OPT variable needs to be set to --exclude-table=table_name for this.
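Concretely (table names below are hypothetical), that means editing the OPT line near the top of the script, since OPT is passed straight through to pg_dump:

```shell
# Repeat --exclude-table once per table to skip
OPT="--exclude-table=audit_log --exclude-table=sessions"
echo "$OPT"
```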

@matthieume matthieume commented Jun 12, 2019

Thanks for the script.
My OS wasn't in English; I had to change line 318 to add the option -t (do not display header) like this:
DBNAMES="`psql -U $USERNAME $HOST -l -A -F: -t | sed -ne "/:/ { /Name:Owner/d; /template0/d; s/:.*$//; p }"`"

@mzpqnxow mzpqnxow commented Dec 16, 2020

You can make a few small changes to this script to use pixz instead of gzip or bzip2. The pixz compression utility has the following benefits:

  1. It can operate concurrently during compression, making use of an arbitrary number of CPUs. Depending on the concurrency level you specify, this can SIGNIFICANTLY improve compression time. Note that decompression cannot be done concurrently
  2. It has a better compression ratio than both gzip and bzip2, especially on ASCII-based files with default settings. An example ASCII psql file of 3.2GB compressed to 198MB with default bzip2 settings, but to 147MB with default pixz settings, a pretty big difference. YMMV depending on the entropy of your database data, but because it's mostly ASCII it should compress really well, similar to the example case

To use pixz with this (very nice) script, you need two changes. Change the compression() function to look like this:

# Compression function
SUFFIX=""
compression () {
if [ "$COMP" = "gzip" ]; then
	gzip -f "$1"
	echo
	echo Backup Information for "$1"
	gzip -l "$1.gz"
	SUFFIX=".gz"
elif [ "$COMP" = "bzip2" ]; then
    echo Compression information for "$1.bz2"
    bzip2 -f -v $1 2>&1
    SUFFIX=".bz2"
elif [ "$COMP" = "pixz" ]; then
    echo Compression information for "$1.xz"
    pixz -p "$CONCURRENCY_LEVEL" $1 2>&1
    SUFFIX=".xz"
else
	echo "No compression option set, check advanced settings"
fi
return 0
}

You will also need to add the following at the top of the file, where the user variables are set:

# How many processors to use when compressing, only used when COMP=pixz
CONCURRENCY_LEVEL=n
# Choose Compression type. (gzip, bzip2 or pixz)
COMP=pixz

... where n is the integer count of logical CPUs you'd like to use for the compute-intensive compression operations. More is faster, obviously.

WARNING

You obviously want to be careful not to completely hose your machine by specifying a CONCURRENCY_LEVEL value that is too high. Generally speaking, you shouldn't ever go above n-1, where n is the total logical CPU count of the system. If the database server is running on the same machine, you'll want to be a bit more conservative than that, or there will be a big impact on postgres performance while all of the CPUs are tied up. Use this modification at your own risk; it will speed up the compression considerably.

On Linux, you can get the count of logical cores using:

$ cat /proc/cpuinfo | grep '^processor\s:' | wc -l

EDIT: Thanks for this excellent script by the way, I really like how it lays things out for archival!
