Embedding an encrypted folder with rclone

There are many use cases for rclone. Mine was to "sync" with my "free" 15 GB Google Drive on Linux. While most of my files aren't particularly sensitive, there are a number that are. Until recently, I encrypted these manually using 7-zip. The problem with that was having to decrypt manually every time I wanted to look at them.

The Problem

Rclone has an encryption overlay that can be used to encrypt either a single folder or all folders on a configured remote (a networked storage system like Google Drive, OneDrive, AWS S3 or GCS). Setting it up is pretty easy, but there's an unintended consequence that none of the many tutorials on rclone encryption discuss: if your encrypted folder sits on an already existing remote (which is my preferred setup), running sync on the whole remote will remove the unencrypted files from the local copy of that folder and replace them with their encrypted counterparts.

The Solution

The solution to this conundrum isn't hard: use the "--filter" option to exclude the folder you're using to store encrypted files.

An example is worth a thousand words.

Assume a remote named "gdrive" configured to work with Google Drive (using rclone's "drive" backend, https://rclone.org/drive/). Also assume a crypt-type remote named "gdrive-crypt" configured to encrypt files in "gdrive:/Private" (https://rclone.org/crypt/).
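
For concreteness, the corresponding entries in rclone.conf might look roughly like the following. The token and the two obscured passwords are placeholders here; rclone config generates the real values, and a drive remote will usually carry a few more fields than shown:

[gdrive]
type = drive
scope = drive
token = {"access_token":"...","expiry":"..."}

[gdrive-crypt]
type = crypt
remote = gdrive:/Private
password = <obscured password generated by rclone config>
password2 = <obscured salt generated by rclone config>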

To sync gdrive to a local folder called "GDrive", you'd issue this command:

$ rclone sync gdrive: ~/GDrive

Now say you have some files in ~/GDrive/Private that you'd like encrypted up on Google Drive:

$ rclone sync ~/GDrive/Private gdrive-crypt:

So far, so good. But what if you sync down from Google Drive to the ~/GDrive folder again?

All the files previously readable in ~/GDrive/Private are now in their encrypted, unreadable form!

I can restore the readable versions by running the following:

$ rclone sync gdrive-crypt: ~/GDrive/Private

But that's a bit of work. Why not just do this to begin with:

$ rclone sync gdrive: ~/GDrive --filter "- /Private/**"

Or, to make it a little less cumbersome, put that pattern in a file, say ~/etc/gdrive-filter.conf, that looks like this:

- /Private/**

Once that's done, you can run this command:

$ rclone sync gdrive: ~/GDrive --filter-from ~/etc/gdrive-filter.conf

What makes this even better is that the filter option will also work in the other direction:

$ rclone sync ~/GDrive gdrive: --filter-from ~/etc/gdrive-filter.conf

Scripts

WARNING: Before doing anything, be sure to back up your existing local GDrive folder:

$ tar -cf GDrive.tar GDrive

I have written a couple of scripts and a filter file to make all this a bit easier:

gdrive-down

#!/bin/bash
# Download from Google Drive
# Sync everything except /Private from the plain remote, then pull the
# decrypted contents of the crypt remote into the local Private folder.
/usr/bin/rclone sync gdrive: ${HOME}/GDrive --filter-from ${HOME}/etc/gdrive-filter.conf --fast-list -P
/usr/bin/rclone sync gdrive-crypt: ${HOME}/GDrive/Private --fast-list -P

gdrive-up

#!/bin/bash
# Upload to Google Drive
# Push everything except /Private to the plain remote, then push the local
# Private folder up through the crypt remote so it is stored encrypted.
/usr/bin/rclone sync ${HOME}/GDrive gdrive: --filter-from ${HOME}/etc/gdrive-filter.conf --fast-list -P
/usr/bin/rclone sync ${HOME}/GDrive/Private gdrive-crypt: --fast-list -P

gdrive-filter.conf

- /Private/**
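
To use the scripts, make them executable and put them somewhere on your PATH. The locations below (~/bin for the scripts, ~/etc for the filter file) are just one way to lay things out, not anything rclone requires:

$ mkdir -p ~/bin ~/etc
$ chmod +x ~/bin/gdrive-down ~/bin/gdrive-up
$ gdrive-down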

Notes

  1. It's best to start out with a clean (as in empty) local directory. Over time, I've had rclone choke when trying to sync certain files for no apparent reason. All sync programs do this, which is why it's always important to make sure you've got redundant backups.

  2. Part of my rclone workflow includes a cron job on the home server that runs the gdrive-down script early every morning (a sample crontab entry is sketched after these notes). Those files then get backed up to AWS S3 later on for redundancy.

  3. This works with the Windows port of rclone as well, although I haven't established a workflow (scripts, config file) for it yet.

  4. As far as local storage is concerned, my personal Ubuntu laptop's SSD disk is encrypted and password protected by LUKS.

  5. The main drawback to this solution is, of course, that you can't read the encrypted files in a web browser session or the Google Drive mobile app -- the classic conflict between security and convenience. There are a number of strategies for dealing with this. For me what works is regularly reviewing my files to see what may need to be moved out of Private for a while so it can be accessed over the web (like tax documents during tax time), versus things that don't (ssh and gpg private keys).
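
For reference, the cron job mentioned in note 2 could be something like the entry below. The 5:00 AM schedule, the ~/bin location and the log file name are all assumptions; adjust them to your own setup:

# run gdrive-down every morning at 5:00, logging output for troubleshooting
0 5 * * * $HOME/bin/gdrive-down >> $HOME/gdrive-down.log 2>&1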
