@chazlarson
Last active December 27, 2019 22:34
Upload a large set of files to Google Drive (presumably a Team Drive), cycling through a set of service accounts at 500 GB each. Assumptions: you have 100 service accounts, their JSON credential files are named `sa-1.json` through `sa-100.json` and are stored in `/opt/sa-json`, and the files you want to upload are in `/files/to/upload`.
#!/bin/bash
# Cycle through 100 service accounts, letting each copy up to 500 GB
# before handing off to the next one.
COUNTER=1
SOURCE="/files/to/upload"
DESTINATION="remote:path"
JSON_LOC="/opt/sa-json"

while [ "$COUNTER" -le 100 ]; do
    echo "Using service account sa-$COUNTER"
    /usr/bin/rclone copy -v --delete-excluded \
        --fast-list --checkers=32 --transfers=16 --max-transfer 500G \
        --stats 5s --drive-service-account-file="$JSON_LOC/sa-$COUNTER.json" \
        --log-file=sync.log --drive-chunk-size 64M \
        "$SOURCE" "$DESTINATION"
    COUNTER=$((COUNTER + 1))
done
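Since the loop assumes all 100 credential files are present, it can be worth checking for them before kicking off a long-running upload. A minimal sketch (the directory and `sa-N.json` naming scheme are the ones assumed above; `count_missing` is a hypothetical helper, not part of rclone):

```shell
#!/bin/bash
# Count how many of the expected service-account JSON files
# (sa-1.json .. sa-N.json) are missing from a directory.
count_missing() {
    local dir="$1" n="$2" missing=0
    for i in $(seq 1 "$n"); do
        # Increment the counter for each expected file that is absent.
        [ -f "$dir/sa-$i.json" ] || missing=$((missing + 1))
    done
    echo "$missing"
}
```

For example, `count_missing /opt/sa-json 100` prints the number of absent credential files; running it before the loop and bailing out if the result is nonzero avoids discovering a missing account mid-transfer.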