MySQL Cron Backup: Place this script in the /etc/cron.hourly directory (make sure the file is executable; an install sketch follows the script) to take a MySQL backup and dump it to a file.

gistfile1.sh
#!/bin/sh
# Work inside the backup directory; bail out if it is missing so the
# cleanup step below never deletes files somewhere else.
cd /home/bret/db_backups || exit 1
DATE=$(date +"%Y%m%d%H%M")
# Export all MySQL databases as a gzip archive; --rsyncable keeps the
# compressed output friendly to block-level de-duplication
mysqldump -u backup_user --password=secret_password --all-databases | gzip --rsyncable > backup_db_$DATE.sql.gz
# Keep only the 30 newest dumps; -r stops xargs from running rm when
# there is nothing to delete
ls -t | sed -e '1,30d' | xargs -d '\n' -r rm
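A minimal sketch of the install step mentioned in the description (the destination filename is an assumption; on Debian/Ubuntu, run-parts skips cron.hourly entries whose names contain a dot, so drop the .sh extension):

# Copy into the hourly cron directory under a dot-free name and make it executable
sudo cp gistfile1.sh /etc/cron.hourly/mysql-backup
sudo chmod +x /etc/cron.hourly/mysql-backup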

Jamie from the Cloud Backup team here. If you want the DB dumps compressed on disk, it's highly recommended that you use gzip's --rsyncable option. The reason: Cloud Backup performs block-level de-duplication, and plain gzip changes the entire remainder of an archive when a single bit of the input changes, which defeats the agent's ability to de-duplicate. The --rsyncable option makes gzip periodically reset its compression state so the archives can be rsynced efficiently, with the nice side effect that the Cloud Backup agent can de-duplicate across successive backups. I would have sent you a pull request, but that's apparently not possible with a gist.

If disk space is not a concern, you will get the most efficient backups by not compressing the dumps at all; the backup agent already performs zlib (gzip) compression of the data as it is backed up.
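For example (a minimal sketch, reusing the placeholder credentials and DATE variable from the gist), the uncompressed variant of the dump line would look like:

# Uncompressed dump; the Cloud Backup agent applies zlib compression as it backs the file up
mysqldump -u backup_user --password=secret_password --all-databases > backup_db_$DATE.sql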

Jamie, thanks for the tip! I updated the gist.
