@ditatompel
Created January 17, 2023 21:56
Simple bash script to automatically back up multiple Ghost blogs on the same server to a remote AWS S3-compatible server.
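To run the backups unattended, the script can be scheduled with cron. A minimal sketch, assuming the script is saved as /opt/backup/ghost-backup.sh (hypothetical path) and made executable:

# Hypothetical crontab entry: run the backup every day at 02:30
30 2 * * * /opt/backup/ghost-backup.sh >> /var/log/ghost-backup.log 2>&1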
#!/bin/bash
# Back up Ghost website(s) to MinIO
# https://rtd.ditatompel.com/ghost-multi-blog-backup-bash-script-to-minio-s3-compatible/
# Inspired by https://jerrynsh.com/backing-up-ghost-blog-in-5-steps/
#
# This script needs the MinIO client (mc) configured, see:
# https://min.io/docs/minio/linux/reference/minio-mc.html
# Or edit the S3_SECTION below and adapt it to your
# favorite S3 client.
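#
# An mc alias can be created like this (endpoint and credentials
# below are placeholders, not real values):
#   mc alias set myminio https://s3.example.com ACCESS_KEY SECRET_KEY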
set -e
MINIO_REMOTE_ALIAS="myminio" # your mc `alias` name
MINIO_BUCKET="backups"
MINIO_FOLDER="ghost/" # Mandatory; don't forget the trailing slash
# Array of websites; each entry contains two `|`-separated fields:
# field 1 = website shortname, used to organize the backup folder location on S3
# field 2 = Ghost website directory
GHOST_WEBSITES=(
"example_blog1|/path/to/blog1" # 1st website
"example_blog2|/path/to/blog2" # 2nd website
)
##### End basic config #####
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
for WEBSITE in "${GHOST_WEBSITES[@]}"
do
IFS='|' read -ra WEBPARAMS <<< "$WEBSITE"
if [ ! -d "${WEBPARAMS[1]}" ]; then
echo "Folder not exists.. Skipping ${WEBPARAMS[0]}"
else
BACKUPDATE=$(date +%Y-%m-%d-%H-%M)
echo "Performing backup of ${WEBPARAMS[0]}"
cd "${WEBPARAMS[1]}"
### ARCHIVE ###
tar -czf "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" content/ config.production.json package-lock.json
### DATABASE SECTION ###
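# Read DB credentials from this blog's own Ghost config
# (the ghost CLI must run inside the install directory)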
db_user=$(ghost config get database.connection.user | tail -n1)
db_pass=$(ghost config get database.connection.password | tail -n1)
db_name=$(ghost config get database.connection.database | tail -n1)
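# --no-tablespaces avoids needing the PROCESS privilege on MySQL 8+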
mysqldump -u"$db_user" -p"$db_pass" "$db_name" --no-tablespaces | gzip > "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz"
### S3_SECTION ###
# adapt to your env
mc cp "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz" "$MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz"
mc cp "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz" "$MINIO_REMOTE_ALIAS/$MINIO_BUCKET/$MINIO_FOLDER${WEBPARAMS[0]}/$(date +%Y)/$(date +%m)/$BACKUPDATE-$db_name.sql.gz"
# REMOVE LOCAL BACKUP
rm -f "$SCRIPT_DIR/$BACKUPDATE-${WEBPARAMS[0]}.tar.gz"
rm -f "$SCRIPT_DIR/$BACKUPDATE-$db_name.sql.gz"
cd "$SCRIPT_DIR"
fi
done
exit 0
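Restoring is roughly the reverse of the script above. A minimal sketch, assuming hypothetical backup file names produced by a 2023-01-17 run for example_blog1 (adjust paths, names, and credentials to your setup):

# Download the archives from MinIO (bucket layout as created by the script)
mc cp myminio/backups/ghost/example_blog1/2023/01/2023-01-17-21-56-example_blog1.tar.gz .
mc cp myminio/backups/ghost/example_blog1/2023/01/2023-01-17-21-56-blog1_db.sql.gz .
# Unpack content/, config, and lockfile back into the Ghost install directory
tar -xzf 2023-01-17-21-56-example_blog1.tar.gz -C /path/to/blog1
# Re-import the database dump (DB_USER and DB_NAME are placeholders)
gunzip -c 2023-01-17-21-56-blog1_db.sql.gz | mysql -u DB_USER -p DB_NAME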