
@caraboides
Last active August 2, 2023 06:11
Simple script to back up a MongoDB collection to S3 without wasting disk space on temp files, plus a way to restore from the latest snapshot.
#!/bin/sh
set -e

HOST=localhost
DB=test-entd-products
COL=asimproducts
BUCKET="s3://mongodb-backups-test1-entd"
S3PATH="$BUCKET/$DB/$COL/"
S3BACKUP=$S3PATH$(date +"%Y%m%d_%H%M%S").dump.gz
S3LATEST=${S3PATH}latest.dump.gz

# Create the bucket if it does not exist yet (mb takes a bucket name, not a key prefix).
/usr/bin/aws s3 mb "$BUCKET" || true

# Stream the dump straight to S3: mongodump writes the single collection as BSON to
# stdout (-o -), gzip compresses it, and aws s3 cp uploads from stdin, so no temp
# file ever touches the disk.
/usr/bin/mongodump -h "$HOST" -d "$DB" -c "$COL" -o - | gzip -9 | aws s3 cp - "$S3BACKUP"

# Keep a copy of the newest dump under a fixed "latest" key for easy restores.
aws s3 cp "$S3BACKUP" "$S3LATEST"

# Restore: print the command that restores the collection from the latest snapshot.
echo "Restore: aws s3 cp $S3LATEST - | gzip -d | mongorestore --host $HOST --db $DB -c $COL -"
@innokentiyt

How does the script work with very large (~60-100 GB) collection dumps?

@caraboides
Author

@innokentiyt It works ;-)
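
One caveat for dumps that big: when aws s3 cp uploads from stdin it cannot know the total size, and streams larger than about 50 GB can exceed the multipart part limit unless you pass a byte-count hint via --expected-size. A sketch of the upload step with that hint, where the 100 GB value is only an assumed example to be adjusted to the collection:

/usr/bin/mongodump -h "$HOST" -d "$DB" -c "$COL" -o - | gzip -9 | aws s3 cp - "$S3BACKUP" --expected-size 107374182400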
