#!/bin/bash
#start
#-----------------------------------------------------------------------
find /srv/backup/daily/databases/ -name '*.gz' -mtime +7 | xargs rm -f;
find /srv/backup/daily/websites/ -name '*.gz' -mtime +7 | xargs rm -f;
# Uncomment if weekly backups are implemented:
# find /srv/backup/weekly/ -name '*.gz' -mtime +30 | xargs rm -f;
#-----------------------------------------------------------------------
#end
# Example Schedule
# Remove Backups Greater than 7 Days Old Daily @ 01:00 AM
# 00 01 * * * root /etc/cron.daily/backup-cleanup
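# Note on the schedule lines above: lines with a user field ("root") belong in
# /etc/crontab or a /etc/cron.d/ file, while scripts dropped into /etc/cron.daily
# are run by run-parts at the distro's default daily time. A minimal sketch,
# assuming the scripts live at the paths shown (the /etc/cron.d/backups filename
# is illustrative), collecting all four jobs in one cron.d file:
#
# 30 00 * * * root /etc/cron.daily/backup-databases
# 45 00 * * * root /etc/cron.daily/backup-wordpress-site-files
# 00 01 * * * root /etc/cron.daily/backup-cleanup
# 30 01 * * * root /etc/cron.daily/server-reboot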
#!/bin/bash
#start
#-----------------------------------------------------------------------
#verify directory structure exists prior to running this job
BackUpDIR="/srv/backup/daily/databases/";
DateStamp=$(date +"%Y%m%d");
#Format of DBList="db1 db2 db3 db4"
DBList="[dbname]";
#use an account with permissions on all databases to be backed up, e.g. a system administrator account typically named sysadmin
DBUser="[user with db permissions like sysadmin]";
DBPwd="[user password]";
for DB in $DBList;
do
# full dump including CREATE DATABASE / DROP TABLE statements
mysqldump --opt -u"$DBUser" -p"$DBPwd" --add-drop-table --lock-tables --databases "$DB" > "$BackUpDIR$DateStamp.$DB.sql";
tar zcf "$BackUpDIR$DateStamp.DB.$DB.tar.gz" -P "$BackUpDIR$DateStamp.$DB.sql";
rm -f "$BackUpDIR$DateStamp.$DB.sql";
# table-only dump (no CREATE DATABASE), useful for restoring into an existing database
mysqldump --opt -u"$DBUser" -p"$DBPwd" --add-drop-table --lock-tables "$DB" > "$BackUpDIR$DateStamp.$DB.tbls.sql";
tar zcf "$BackUpDIR$DateStamp.DB.$DB.tbls.tar.gz" -P "$BackUpDIR$DateStamp.$DB.tbls.sql";
rm -f "$BackUpDIR$DateStamp.$DB.tbls.sql";
done
#-----------------------------------------------------------------------
#end
# Example Schedule
# Backup Databases Daily @ 12:30 AM
# 30 00 * * * root /etc/cron.daily/backup-databases
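# Passing the password with -p on the command line exposes it in the process
# list. A common alternative (a sketch, not part of the original script) is a
# root-readable credentials file passed via --defaults-extra-file, which must
# be the first option; the /root/.backup.my.cnf path is illustrative:
#
# cat > /root/.backup.my.cnf <<'EOF'
# [client]
# user=[user with db permissions like sysadmin]
# password=[user password]
# EOF
# chmod 600 /root/.backup.my.cnf
# mysqldump --defaults-extra-file=/root/.backup.my.cnf --opt --add-drop-table --lock-tables --databases "$DB" > "$BackUpDIR$DateStamp.$DB.sql"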
#!/bin/bash
#start
#-----------------------------------------------------------------------
#verify directory structure exists prior to running this job
BackUpDIR="/srv/backup/daily/websites/";
SrvDir="/srv/www/";
#Format of SiteList="sitefolder1 sitefolder2 sitefolder3"
SiteList="[sitefolders]";
DateStamp=$(date +"%Y%m%d");
for Site in $SiteList;
do
# back up all site files; exclude the Rackspace Cloud Files CDN directory if you use it
tar zcf "$BackUpDIR$DateStamp.website.code.$Site.tar.gz" -P "$SrvDir$Site" --exclude "$SrvDir$Site/wp-content-cloudfiles";
done
#-----------------------------------------------------------------------
#end
# Example Schedule
# Backup WordPress Site Files Daily @ 12:45 AM
# 45 00 * * * root /etc/cron.daily/backup-wordpress-site-files
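# Because the archives are created with -P (absolute paths preserved), a
# restore can extract straight back into /srv/www. A sketch, using a
# hypothetical generated filename from the script above:
#
# tar zxf /srv/backup/daily/websites/20150101.website.code.mysite.tar.gz -P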
# see "man logrotate" for details
# rotate log files daily
daily
# keep one rotation of backlogs
rotate 1
# create new (empty) log files after rotating old ones
create
# use the date in backlog filenames
dateext
# compress backlogs with a delay
compress
# packages drop log rotation information into this directory
include /etc/logrotate.d
# no packages own wtmp, or btmp -- we'll rotate them here
/var/log/wtmp {
missingok
weekly
create 0664 root utmp
rotate 7
}
/var/log/btmp {
missingok
weekly
create 0664 root utmp
rotate 7
}
# system-specific logs may be configured here
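# This configuration can be checked without touching any logs by running
# logrotate in debug mode (a dry run), and forced once to verify the rotation:
#
# logrotate -d /etc/logrotate.conf   # debug/dry run, prints what would happen
# logrotate -f /etc/logrotate.conf   # force a rotation now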
#!/bin/bash
#start
#-----------------------------------------------------------------------
# delete the nginx cache if it exists
rm -rf /var/cache/nginx
# log the reboot time, then restart the server
DateStamp=$(date +"%Y%m%d %k:%M:%S");
echo "$DateStamp" >> /var/log/cron.reboot.log;
/sbin/shutdown -r now
#-----------------------------------------------------------------------
#end
# Example Schedule
# Reboot Server Daily @ 1:30 AM
# 30 01 * * * root /etc/cron.daily/server-reboot
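# The reboot script appends to /var/log/cron.reboot.log indefinitely; a small
# drop-in for the logrotate setup above would keep it bounded (a sketch, the
# rotation interval and count are illustrative):
#
# /var/log/cron.reboot.log {
# monthly
# rotate 3
# missingok
# compress
# }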