- Use rclone to link to Google Drive:
ml rclone
rclone config
Choose Google Drive and follow the instructions to set up an endpoint called gdrive.
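Alternatively, the remote can be created non-interactively with rclone's `config create` subcommand; a minimal sketch (the Google OAuth authorization step will still open a browser):

```shell
# Create a remote named "gdrive" of type "drive" (Google Drive);
# rclone prompts for OAuth authorization in a browser.
rclone config create gdrive drive
```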
- Create a script that runs rclone sync, called backup_to_gdrive.sh (referenced below).
- Create an exclusion file.
- Create a directory on the Google Drive where this backup will go:
rclone mkdir gdrive:tallboy/mcuma
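The exclusion file is a plain list of rclone filter patterns, one per line. A hypothetical exclude.txt might look like this (the exact patterns depend on what you want to skip):

```
.cache/**
.local/**
.mozilla/**
*.tmp
```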
- Do an initial sync, either by running our script or with the explicit command:
rclone -v sync /home/mcuma gdrive:tallboy/mcuma --exclude-from /home/mcuma/bin/scripts/exclude.txt --skip-links >& /home/mcuma/bin/scripts/backup.log
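The backup_to_gdrive.sh script itself is not shown here; one way to create a minimal version that simply wraps the sync command above (paths and remote name taken from the example, and should be adjusted for your setup):

```shell
# Write a minimal backup_to_gdrive.sh wrapping the sync command shown above.
# Paths and the remote name come from the example and may need adjusting.
cat > backup_to_gdrive.sh <<'EOF'
#!/bin/bash
# Load rclone via the module system, as in the setup step
ml rclone
rclone -v sync /home/mcuma gdrive:tallboy/mcuma \
  --exclude-from /home/mcuma/bin/scripts/exclude.txt \
  --skip-links >& /home/mcuma/bin/scripts/backup.log
EOF
chmod +x backup_to_gdrive.sh
```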
- Set up a cron job
crontab -e
0 2 * * * /home/mcuma/bin/scripts/backup_to_gdrive.sh
This runs the backup daily at 2 am.
- rclone currently does not support symbolic links, which is why we use the
--skip-links
option. Support for links is planned for the future.
- Use the exclude file to skip dot directories and other nonessential files, since copying many small files is very slow: it took me 3 days to copy 50 GB spread over 60,000 files.
- By default, UofU Google Drive permissions are set so that files are viewable by everyone at the U with a link. It is not easy to change this in bulk so that all files are private: in the Google Drive web GUI one can only mass-select files in the current directory, and the permission change does not propagate into subdirectories. www.whohasaccess.com can scan for files and turn off access, but it seems to only go 2 levels deep.