Automatically backup a MongoDB database to S3 using mongodump, tar, and awscli (Ubuntu 14.04 LTS)
#!/bin/sh
# Make sure to:
# 1) Name this file `backup.sh` and place it in /home/ubuntu
# 2) Run sudo apt-get install awscli to install the AWSCLI
# 3) Run aws configure (enter s3-authorized IAM user and specify region)
# 4) Fill in DB host + name
# 5) Create S3 bucket for the backups and fill it in below (set a lifecycle rule to expire files older than X days in the bucket)
# 6) Run chmod +x backup.sh
# 7) Test it out via ./backup.sh
# 8) Set up a daily backup at midnight via `crontab -e`:
# 0 0 * * * /home/ubuntu/backup.sh > /home/ubuntu/backup.log 2>&1
# DB host (secondary preferred as to avoid impacting primary performance)
HOST=db.example.com
# DB name
DBNAME=my-db
# S3 bucket name
BUCKET=s3-bucket-name
# Linux user account
USER=ubuntu
# Current time
TIME=$(/bin/date +%d-%m-%Y-%T)
# Backup directory
DEST=/home/$USER/tmp
# Tar file of backup directory
TAR=$DEST/../$TIME.tar
# Create backup dir (-p to avoid warning if already exists)
/bin/mkdir -p $DEST
# Log
echo "Backing up $HOST/$DBNAME to s3://$BUCKET/ on $TIME";
# Dump from mongodb host into backup directory
/usr/bin/mongodump -h $HOST -d $DBNAME -o $DEST
# Create tar of backup directory
/bin/tar cvf $TAR -C $DEST .
# Upload tar to s3
/usr/bin/aws s3 cp $TAR s3://$BUCKET/
# Remove tar file locally
/bin/rm -f $TAR
# Remove backup directory
/bin/rm -rf $DEST
# All done
echo "Backup available at https://s3.amazonaws.com/$BUCKET/$TIME.tar"
@eladnava (Owner) commented on Jun 7, 2016

Restore from Backup

Download the .tar backup to the server from the S3 bucket via wget or curl:

wget -O backup.tar https://s3.amazonaws.com/my-bucket/xx-xx-xxxx-xx:xx:xx.tar

Alternatively, use the awscli to download it securely.
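For example, assuming the same bucket and backup file name as above (and that aws configure has been run on the target server):

aws s3 cp s3://my-bucket/xx-xx-xxxx-xx:xx:xx.tar backup.tar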

Then, extract the tar archive:

tar xvf backup.tar

Finally, import the backup into a MongoDB host (the extracted directory is named after the dumped database):

mongorestore --host {db.example.com} --db {my-db} {my-db}/
@wabirached

Thanks @eladnava, this worked great for me. That said, I noticed the script only uploaded the backup to S3 when I ran it manually, not when it ran through the cron job. Under cron it created the backup file correctly but failed to upload it to S3.

The reason is that cron jobs run with a different HOME than your user's home folder (~/), so the awscli could not find its config with the S3 credentials. To fix this:
1. Run cd ~/ and note your home folder's path.
2. Add that path at the top of backup.sh: export HOME=/your/home/folder

For instance, in my case I had to add export HOME=/home/ubuntu at the top of the file for the cron job to upload the backup successfully, as shown in the sketch below.
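A minimal sketch of the top of backup.sh with this fix applied (assuming the ubuntu user's home directory; use the path from step 1):

#!/bin/sh
# Cron runs with its own HOME, so point it at the user's home folder
# so the awscli can find ~/.aws/config and ~/.aws/credentials
export HOME=/home/ubuntu
# ... rest of backup.sh continues unchanged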

@nesbtesh

Great, thanks.

Add a username and password for authentication. Also, please note it is bad practice to run a database without authentication.

/usr/bin/mongodump -h $HOST -d $DBNAME -o $DEST --username $USERNAME --password $PASSWORD
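If authentication is enabled, mongorestore needs the same credentials when importing the backup; a sketch assuming the user is defined in the admin database:

mongorestore --host {db.example.com} --db {my-db} --username $USERNAME --password $PASSWORD --authenticationDatabase admin {my-db}/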
