@iangreenleaf
Created January 18, 2010 07:12
rsync with retries
#!/bin/bash
### ABOUT
### Runs rsync, retrying on errors up to a maximum number of tries.
### Simply edit the rsync line in the script to whatever parameters you need.
# Trap interrupts and exit instead of continuing the loop
trap "echo Exited!; exit;" SIGINT SIGTERM
MAX_RETRIES=50
i=0
# Set the initial return value to failure so the loop runs at least once
false
# Keep retrying until rsync succeeds or the retry limit is reached
while [ $? -ne 0 ] && [ $i -lt $MAX_RETRIES ]
do
    i=$(($i+1))
    # rsync must stay the last command in the loop body so that $? holds
    # its exit status when the while condition is re-evaluated
    rsync -avz --progress --partial -e "ssh -i /home/youngian/my_ssh_key" /mnt/storage/duplicity_backups backupuser@backup.dreamhost.com:.
done
if [ $i -eq $MAX_RETRIES ]
then
    echo "Hit maximum number of retries, giving up."
fi
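
Assuming the script is saved locally as rsync_retry.sh (a hypothetical filename; the gist does not name the file), it can be made executable and run directly:

chmod +x rsync_retry.sh
./rsync_retry.sh
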
@felipou commented Feb 14, 2016

I used your code as reference to create a generic bash retry command: https://gist.github.com/felipou/6fbec22c4e04d3adfae5
Thanks for sharing it!
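
For context, here is a minimal sketch of what such a generic retry wrapper could look like. It is an illustration only, not taken from the linked gist, and the retry.sh name and max_tries interface are hypothetical:

#!/bin/bash
# retry.sh -- hypothetical generic retry wrapper
# Usage: ./retry.sh <max_tries> <command> [args...]
max_tries=$1
shift
count=0
# Re-run the command until it succeeds or the limit is reached
until "$@"; do
    count=$((count+1))
    if [ "$count" -ge "$max_tries" ]; then
        echo "Giving up after $count failed attempts: $*" >&2
        exit 1
    fi
    echo "Attempt $count failed, retrying..." >&2
done

It could then be invoked as, for example, ./retry.sh 5 rsync -avz src/ user@host:dest/.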

@brackendawson commented

This has a concurrency issue. It's not safe to retry with --partial specified alone: a previous failed rsync process on the target can write a short version of a file over a completed one. Add the --delay-updates option; it writes to a temporary file and only puts complete files into the actual file path, rendering stray processes inert.
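
For reference, this is what the rsync line from the script above might look like with the suggested flag added; the flag combination is only what this comment proposes, not something confirmed by the original author:

rsync -avz --progress --partial --delay-updates -e "ssh -i /home/youngian/my_ssh_key" /mnt/storage/duplicity_backups backupuser@backup.dreamhost.com:.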
