@jcockhren
Created November 12, 2013 00:07
This is a full example of using both a git-backed pillar and states to implement a scheduler that makes daily MySQL DB dumps and uploads them to S3. It still needs to be made generic enough to handle multiple DB types and remote data stores. backup.init: this verifies that a special DB user for backups exists and grants it the needed privileges for all…
#!pydsl
import datetime

tod = datetime.datetime.today().strftime('%m%d%Y_%H%M')
backup = state('backup')

for name, data in __pillar__['sites'].iteritems():  # iterates over a client's list of sites in pillar
    dbname = data['db_name']
    file = "%(db)s_%(tod)s.sql" % {'db': dbname, 'tod': tod}
    # This should be relative to the DB backend in use; the command assumes MySQL only.
    command = "mysqldump -usophic %(db)s > /tmp/%(file)s" % {'db': dbname, 'file': file}
    backup.mysql_database.present(name=dbname, connection_pass=__pillar__['gogo'])
    backup.cmd.run(name=command, shell='/bin/bash').require(mysql_database=dbname).require_in(module='backup')
    backup.module.run(name=__pillar__['backup']['location'] + '.put',
                      bucket=__pillar__['backup']['bucket_name'],
                      path="clienta/" + file,
                      local_file="/tmp/" + file)
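As the inline comment notes, the dump command above assumes MySQL only. One way to generalize it is to dispatch on the site's `db_type` pillar key. This is a minimal sketch of that idea; the `DUMP_COMMANDS` table and `dump_command` helper are my own illustration (only the `mysql` entry mirrors the gist, the `postgresql` entry is an assumption):

```python
# Hypothetical helper: pick a dump-command template by DB type.
DUMP_COMMANDS = {
    'mysql': "mysqldump -u%(user)s %(db)s > /tmp/%(file)s",
    'postgresql': "pg_dump -U %(user)s -f /tmp/%(file)s %(db)s",
}

def dump_command(db_type, user, dbname, filename):
    """Return a shell command that dumps `dbname` to /tmp/`filename`."""
    return DUMP_COMMANDS[db_type] % {'user': user, 'db': dbname, 'file': filename}
```

The loop body could then build the command with `dump_command(data['db_type'], 'sophic', dbname, file)` instead of hard-coding mysqldump.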
#!pydsl
bkdeps = state('bkdeps')
bkdeps.pkg.installed(name='python-mysqldb')

prep = state('prep')
prep.mysql_user.present(host='localhost', allow_passwordless=True,
                        connection_pass=__pillar__['gogo']).require(pkg='python-mysqldb')

for name, data in __pillar__['sites'].iteritems():  # iterates over a client's list of sites in pillar
    dbname = data['db_name']
    remote_folder = data['backup_folder']  # not used yet
    prep.mysql_grants.present(user='backupuser', grant='select,lock tables',
                              database=dbname + '.*', connection_pass=__pillar__['gogo'])

include('backup.calls')
s3.keyid: YOYOYO
s3.key: jsjsjsjsjsjsjsj
s3.service_url: s3.amazonaws.com

sites:
  example.com:
    node: pluto.myhost.com
    webserver: nginx
    webserver_config: clienta
    base_path: /apps
    real_path: /apps/clienta
    repo: git@github.com:clients/examplecom.git
    deploy_branch: master # needed for automated deployments (coming soon)
    db_type: mysql
    db_name: clienta_db
    backup_folder: examplecom/ # remote backup folder

backup:
  location: s3 # I want to support cloudfiles as well.
  bucket_name: clientabucket
  frequency: daily
  retention_days: 7 # not used yet

schedule:
  db_backup:
    function: state.sls
    hours: 24
    args:
      - backup

blahblah: 'yeahyeahyeah'
gogo: 'nonononono'
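The `module.run` upload in the backup state hard-codes the remote prefix as "clienta/", even though this pillar already carries a per-site `backup_folder`. A small sketch of assembling the upload kwargs from pillar data instead (the `upload_args` helper and the trimmed `pillar` stand-in for `__pillar__` are my own names, not from the gist):

```python
# Stand-in for __pillar__, trimmed to the keys the backup states read.
pillar = {
    'backup': {'location': 's3', 'bucket_name': 'clientabucket'},
    'sites': {'example.com': {'db_name': 'clienta_db',
                              'backup_folder': 'examplecom/'}},
}

def upload_args(pillar, site, filename):
    """Build the kwargs for module.run from pillar data for one site."""
    data = pillar['sites'][site]
    return {
        'name': pillar['backup']['location'] + '.put',  # e.g. 's3.put'
        'bucket': pillar['backup']['bucket_name'],
        'path': data['backup_folder'] + filename,       # uses backup_folder, not a hard-coded prefix
        'local_file': '/tmp/' + filename,
    }
```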
base:
  '*':
    - sec
  'client:clienta': # this is a grain set on a minion with the clienta stuff
    - match: grain
    - clienta
    - backup