@athompso
Forked from starkers/prune_old_vols.sh
Last active October 23, 2015 15:23
Prune, delete and remove old Bacula volumes
#!/bin/sh
#set -x
# Prune and delete volumes older than 2 months (requires GNU date):
PAST=$(date --date='2 months ago' +%Y-%m-%d)
#E.g. a hard-coded date:
#PAST="2012-11-29"
#PAST="2013-11-29"
DB=bareos
#tmpfiles
TMPF1=$(mktemp)
TMPF2=$(mktemp)
#extract the matching volume names from PostgreSQL
su - postgres -c "psql --dbname=\"$DB\" --no-align --tuples-only --command=\"select volumename from media where lastwritten < '$PAST'\"" > "$TMPF2"
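# With --no-align and --tuples-only, psql prints one bare volume name per
# line (no header, no row-count footer), so $TMPF2 can be looped over
# directly below, e.g. (hypothetical names):
#   Full-0001
#   web01-Incr-0042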
#how many volumes did we match btw?
COUNT=$(wc -l < "$TMPF2")
if [ "$COUNT" -gt 0 ]; then
echo "Found $COUNT volumes"
for a in $(cat "$TMPF2") ; do
echo "### processing $a ###"
: > "$TMPF1"
printf 'prune yes volume=%s\n' "$a" >> "$TMPF1"
printf 'delete yes volume=%s\n' "$a" >> "$TMPF1"
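# At this point $TMPF1 holds the two bconsole commands for this volume,
# e.g. for a (hypothetical) volume "Full-0001":
#   prune yes volume=Full-0001
#   delete yes volume=Full-0001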
#bconsole < "$TMPF1" 1>/dev/null
# ^ uncomment the above to go live, check this works first!
#check if bconsole exited with an error (capture $? immediately; the
#test command itself would otherwise overwrite it)
RC=$?
if [ "$RC" -gt 0 ]; then
echo "+[$a]: WARNING: something went wrong with bconsole (exit status $RC)"
echo "See: TMPF1=$TMPF1 / TMPF2=$TMPF2"
exit 1
fi
# The above takes care of the Bacula side.. now delete the files on disk.
# As I store my files in a per-host directory (and keep Catalog files
# elsewhere) I just split this into two cases:
# Firstly, let's remove the Catalog files
# (pipe through grep rather than a <<< here-string: the shebang is /bin/sh)
if printf '%s\n' "$a" | grep -q '^Catalog_File'; then
FPATH=/mnt/snas1/backups/new
echo "+[$a] removing file: $FPATH/$a"
echo rm -f "$FPATH/$a"
# rm -f "$FPATH/$a"
# ^ uncomment the above to go live
# ...else it's a client volume
else
# get the client's name from $a (everything before the first "-")
NAME=$(printf '%s\n' "$a" | cut -d '-' -f 1)
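# e.g. a (hypothetical) volume "web01-Full-0001" gives NAME=web01 -- this
# assumes volumes are named <client>-<something>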
FPATH=/mnt/snas1/backups/new/pools
echo "+[$a] removing file: $FPATH/$NAME/$a"
echo rm -f "$FPATH/$NAME/$a"
# rm -f "$FPATH/$NAME/$a"
# ^ uncomment the above to go live
fi
echo
done
else
echo "No volumes found older than $PAST"
fi
#cleanup
rm -f "$TMPF1" "$TMPF2"
## Now, as you may note, the log table can get quite large.
# Bacula does not rely on it internally in any way; rather the "file" table is the important one.
#
#
# bacula=# SELECT
# bacula-# relname AS objectname,
# bacula-# relkind AS objecttype,
# bacula-# reltuples AS "#entries", pg_size_pretty(relpages::bigint*8*1024) AS size
# bacula-# FROM pg_class
# bacula-# WHERE relpages >= 8
# bacula-# ORDER BY relpages DESC;
# objectname | objecttype | #entries | size
#---------------------------------+------------+-------------+---------
# log | r | 2.18772e+06 | 78 GB
# file | r | 3.49857e+08 | 51 GB
# file_jpfid_idx | i | 3.49857e+08 | 24 GB
# file_jobid_idx | i | 3.49857e+08 | 13 GB
# file_pkey | i | 3.49857e+08 | 11 GB
# log_name_idx | i | 2.18772e+06 | 7681 MB
# I suggest you truncate the bastard
# psql $DB -c "TRUNCATE log;"
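# TRUNCATE returns the disk space to the OS immediately (unlike DELETE,
# which leaves dead tuples behind for VACUUM to deal with); re-run the
# pg_class query above afterwards to confirm the log table and
# log_name_idx have shrunk to (near) zero.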