@eladnava
Last active March 11, 2024 10:21
Automatically backup a MongoDB database to S3 using mongodump, tar, and awscli (Ubuntu 14.04 LTS)
#!/bin/sh
# Make sure to:
# 1) Name this file `backup.sh` and place it in /home/ubuntu
# 2) Run `sudo apt-get install awscli` to install the AWS CLI
# 3) Run `aws configure` (enter an S3-authorized IAM user and specify the region)
# 4) Fill in the DB host + name below
# 5) Create an S3 bucket for the backups and fill it in below (set a lifecycle rule on the bucket to expire files older than X days)
# 6) Run `chmod +x backup.sh`
# 7) Test it out via `./backup.sh`
# 8) Set up a daily backup at midnight via `crontab -e`:
# 0 0 * * * /home/ubuntu/backup.sh > /home/ubuntu/backup.log 2>&1
# DB host (prefer a secondary so as to avoid impacting primary performance)
HOST=db.example.com
# DB name
DBNAME=my-db
# S3 bucket name
BUCKET=s3-bucket-name
# Linux user account
USER=ubuntu
# Current time
TIME=$(/bin/date +%d-%m-%Y-%T)
# Backup directory
DEST=/home/$USER/tmp
# Tar file of backup directory
TAR="$DEST/../$TIME.tar"
# Create backup dir (-p to avoid warning if it already exists)
/bin/mkdir -p "$DEST"
# Log
echo "Backing up $HOST/$DBNAME to s3://$BUCKET/ at $TIME"
# Dump from the MongoDB host into the backup directory
/usr/bin/mongodump -h "$HOST" -d "$DBNAME" -o "$DEST"
# Create tar of backup directory
/bin/tar cvf "$TAR" -C "$DEST" .
# Upload tar to S3
/usr/bin/aws s3 cp "$TAR" "s3://$BUCKET/"
# Remove tar file locally
/bin/rm -f "$TAR"
# Remove backup directory
/bin/rm -rf "$DEST"
# All done
echo "Backup available at https://s3.amazonaws.com/$BUCKET/$TIME.tar"
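To restore from one of these backups, reverse the steps: download the tar from S3, unpack it, and replay it with mongorestore. A minimal sketch, assuming the same bucket/host placeholders as the script above; the `restore_backup` function name and the `/tmp/restore` scratch directory are mine, not from the gist:

```shell
#!/bin/sh
# Restore sketch (placeholder names, not from the gist): download a
# backup tar from S3, unpack it, and replay it with mongorestore.
restore_backup() {
    BUCKET=$1     # e.g. s3-bucket-name
    ARCHIVE=$2    # e.g. 01-01-2024-00:00:00.tar (as produced by backup.sh)
    HOST=$3       # e.g. db.example.com
    DEST=/tmp/restore

    # Fetch and unpack the archive into a scratch directory
    /usr/bin/aws s3 cp "s3://$BUCKET/$ARCHIVE" "/tmp/$ARCHIVE"
    /bin/mkdir -p "$DEST"
    /bin/tar xvf "/tmp/$ARCHIVE" -C "$DEST"

    # --drop replaces existing collections with the backed-up versions
    /usr/bin/mongorestore -h "$HOST" --drop "$DEST"
}
```

Invoke it as, e.g., `restore_backup s3-bucket-name 01-01-2024-00:00:00.tar db.example.com`.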
@eladnava

eladnava commented Jun 2, 2019

@TopHatMan What is the exact error message you are facing?

@SuperStar518

@eladnava
will it be okay if the mongodump size is over 11G?

@eladnava

Hi @alidavid0418,
Should be fine, but you will need at least 23GB of free space on your / partition. S3 definitely supports large files. :)
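The free-space requirement mentioned above (the dump directory and the tar briefly coexist, so roughly twice the dump size is needed) can be checked before dumping. A sketch, assuming a POSIX `df`; the `check_free_space` function name and threshold are mine, not from the gist:

```shell
#!/bin/sh
# Sketch (assumption): verify that a partition has enough free space
# before starting the dump, so the backup fails fast instead of filling
# the disk halfway through.
check_free_space() {
    path=$1          # directory on the partition to check, e.g. /home/ubuntu
    required_kb=$2   # minimum free space in KB
    # df -Pk gives POSIX-format output in 1KB blocks; column 4 is "Available"
    avail_kb=$(df -Pk "$path" | awk 'NR==2 {print $4}')
    [ "$avail_kb" -ge "$required_kb" ]
}
```

In backup.sh you could then guard the dump with something like `check_free_space /home/ubuntu $((23 * 1024 * 1024)) || exit 1`.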

@SuperStar518

@eladnava
Thank you for your kind attention and confirmation. +1

@kshitijjind

error parsing command line options: expected argument for flag '-h, --host', but got option '-d'
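This error most likely means HOST was left empty or unset: unquoted, an empty `$HOST` vanishes from the argument list entirely, so mongodump sees `-d` where the argument of `-h` should be. A small demonstration of the effect (the `set --` trick just captures the resulting argument list):

```shell
#!/bin/sh
# Sketch (assumption): reproduce the argument shift caused by an empty,
# unquoted HOST variable in the backup script.
HOST=
DBNAME=my-db
set -- -h $HOST -d $DBNAME    # what mongodump actually receives
echo "argc=$# argv=$*"        # prints "argc=3 argv=-h -d my-db"
```

Double-check that HOST is filled in, and quote expansions (`-h "$HOST"`) so an empty value fails loudly instead of silently shifting flags.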

@siddheshjayawantv3it

How do I back up MongoDB data (running in an ECS container) to an S3 bucket?
How and where do I run the backup script?

@borekbb

borekbb commented Mar 5, 2024

and a simpler way to do all this is: mongodump --archive --gzip | aws s3 cp - s3://my-bucket/some-file

clean and simple! thanks
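The streaming variant above never touches local disk: mongodump writes a gzipped archive to stdout and `aws s3 cp -` streams stdin straight to the bucket. A sketch of both directions, with my own function names and placeholder bucket/key, assuming mongodump/mongorestore support `--archive --gzip` (MongoDB 3.2+):

```shell
#!/bin/sh
# Streaming backup/restore sketch (bucket and key are placeholders).
# No intermediate files: the gzipped archive flows through the pipe.
stream_backup() {
    mongodump --archive --gzip | aws s3 cp - "s3://$1/$2"
}

# Restore counterpart: stream the archive back from S3 into mongorestore.
stream_restore() {
    aws s3 cp "s3://$1/$2" - | mongorestore --archive --gzip
}
```

Usage would look like `stream_backup my-bucket some-file` and later `stream_restore my-bucket some-file`.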
