We'll be using rclone and a cron job that triggers a custom bash script.
Install rclone
$ sudo apt install rclone
$ sudo rclone config
Set up a Google Drive remote by following the official guide. However, when you reach the step "Use auto config", make sure to answer no if you're running a server without a screen. If you select yes, you'll end up not being able to log in to Google from your machine.
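When you answer no, rclone walks you through a headless authorization flow: on any machine that does have a browser and rclone installed, you run the authorize command and paste the resulting token back into the config prompt on the server. A minimal sketch (the exact prompts vary between rclone versions):
$ rclone authorize "drive"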
When you're all set you should have a remote configured inside rclone. Check your remote's name with sudo rclone config.
$ sudo rclone config
Current remotes:
Name                 Type
====                 ====
my-gdrive            drive
You can test if it's all working by uploading something. The :/ tells rclone to copy to the Google Drive root. Append a folder name, e.g. :/folder, to create a subfolder. The verbose flag -vv will give you a progress bar and output.
$ sudo rclone copy test.txt my-gdrive:/
$ sudo rclone -vv copy test.txt my-gdrive:/
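To confirm the file actually arrived, you can list the remote's contents (assuming the remote name my-gdrive from above). rclone ls lists files with their sizes, rclone lsd lists directories:
$ sudo rclone ls my-gdrive:/
$ sudo rclone lsd my-gdrive:/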
This (ugly, but working) bash script will copy all folders listed in the array backup_folders. It'll create a subdirectory called rclone-backup_<date-time> inside your Google Drive.
#!/bin/bash

# Folders to back up
backup_folders=(
    '/etc'                     # system configs
    '/home'                    # all home folders
    '/var/lib/docker/volumes'  # docker volumes
    '/var/www'                 # html data
)

backup_dest_repo='my-gdrive:/'
backup_repo_date=$(date +'%Y-%m-%d-%H:%M:%S')
backup_repo_folder="rclone-backup_$backup_repo_date"
backup_repo_path="$backup_dest_repo$backup_repo_folder"

# Copy each folder into the timestamped remote directory,
# mirroring its absolute path. Paths are quoted so spaces
# in folder names don't break the rclone call.
for backup_path in "${backup_folders[@]}"
do
    echo "$backup_repo_date: STARTING BACKUP: $backup_path"
    sudo rclone -vv copy "$backup_path" "$backup_repo_path$backup_path" --copy-links
done
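Save the script, make it executable, and do one run by hand before handing it over to cron. The path below matches the cron job further down; it's just an assumption, so adjust it to wherever you keep your scripts:
$ chmod +x /home/user/scripts/backup/run.sh
$ sudo /home/user/scripts/backup/run.sh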
To trigger this at 2am every day and capture its output, create a log folder and set up a cron job.
$ sudo mkdir /var/log/backup
$ sudo crontab -e
0 2 * * * /bin/bash /home/user/scripts/backup/run.sh >> /var/log/backup/rclone.log 2>&1
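Since the cron job appends to the same file, the log will grow without bound. One optional way to keep it in check is a logrotate rule; a minimal sketch, assuming you put it in a new file such as /etc/logrotate.d/rclone-backup (the file name is my own choice):
/var/log/backup/rclone.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}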
All set!