Tutorial for making an encrypted backup on cloud storage using rclone.

Amazon Cloud Drive Advisory

Over the past few days, a security issue came to light in the authentication service used by another tool, acd_cli. Amazon engineers reviewed the source code of that service, found the issue, and blocked acd_cli's authentication keys for Amazon Cloud Drive.

This morning, rclone's authentication keys were apparently blocked by Amazon as well. No reason has been given so far, and rclone does not use a cloud service to authenticate users - it uses a local web server. Theories include an influx of rclone users after acd_cli was blocked, people extracting the API authentication keys from rclone and using them with acd_cli, a combination of both, or Amazon wanting to clamp down on heavy users with several terabytes of data by blocking the tools they use.

The Amazon rep that I spoke with over the phone speculated that it "may be because of a recent event," but offered nothing more. I was offered a full refund, four months into my annual billing cycle.

I will update this notice as more information becomes available, but at this time, I don't recommend ACD. G Suite has become more popular lately: it offers unlimited storage for $10/month (Google doesn't currently enforce the multi-user minimum for unlimited storage), and it has so far been very reliable.


This tutorial is for creating an encrypted backup on a Unix-like system using rclone. rclone supports a variety of cloud storage providers, including Amazon Drive and Google Drive (which gives unlimited accounts for business/education).

The steps on Windows are likely very similar, but I don't have any experience with Windows; the input of anyone who does would be appreciated.

Note that this guide was originally Amazon Drive specific. Reddit titles cannot be edited after posting and I don't want to break any links.

I maintain this guide on GitHub Gist.

Latest Revision: 18 May 2017

Step 1: Choose and Sign Up for a Cloud Storage Service

Both Amazon Drive and Google Drive offer unlimited storage accounts.

With Amazon, anybody can get an unlimited storage account for $60/year; it's not included with a Prime subscription. Redditors have stored tens of terabytes, but there have been reports of accounts being taken down, mostly for high downstream usage combined with unencrypted pirated content. As long as you encrypt your data (this tutorial shows you how), it's unlikely that your content will be taken down.

Google issues unlimited accounts only for business and education users. Education accounts must be issued by a Google Apps admin for your school; however, Google Apps for Education doesn't cost the school anything, so you may be able to keep your account after you leave your institution. You can also register your own business account for $10/month, provided you own a domain name to use. Despite the stated five-account minimum for unlimited storage, Redditors have found that a single account still receives unlimited storage.

rclone supports a number of other storage providers as well, including AWS S3, Backblaze B2, Dropbox, Google Cloud Platform, Hubic, OneDrive, OpenStack Swift hosts, and Yandex Disk.

OneDrive is of particular note because free storage for it is handed out everywhere. Office 365 includes 1 TB, and if you've bought an external drive semi-recently (especially a Seagate), there's a good chance you can register its serial number online for a year of free storage. I've seen lots of other promos as well.

Step 2: Install rclone

You will need to install rclone on the machine you want to back up. As far as I am aware, it isn't currently available from package managers on Linux, so you will need to install it manually. I will cover installation on Linux, and on macOS using Homebrew, a third-party package manager for macOS that's pretty great.

rclone is a Go program that is distributed as a single binary.

If you are trying to use rclone on Windows, you'll have to use the instructions on the rclone website, but /u/WouterNL has a note on adding rclone to your PATH on Windows so that you don't have to specify the full path to the executable.

Linux install

These commands have been copied mostly verbatim from the rclone.org website, with the exception that I have changed curl to wget, since curl is not included by default on some distributions.

# download and unpack the latest stable build
wget https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64

# install the binary
sudo cp rclone /usr/sbin/
sudo chown root:root /usr/sbin/rclone
sudo chmod 755 /usr/sbin/rclone

# install the man page
sudo mkdir -p /usr/local/share/man/man1
sudo cp rclone.1 /usr/local/share/man/man1/
sudo mandb
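
Once installed, you can sanity-check the binary by printing its version (the exact output depends on the build you downloaded):

rclone version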

macOS Install

Install Homebrew using the installation command on their website, then run brew install rclone in Terminal.

Step 3: Authorise rclone to use your Cloud Storage Account

rclone requires authorisation for any of the cloud storage providers it supports. With some providers you can get API keys from the provider's website; with others you must complete an OAuth flow in a browser (rclone's developers never see your keys - the flow uses a local web server).

  1. Run rclone config.
  2. Press n to create a new remote.
  3. Specify a name to reference this remote in commands. For this tutorial, we will simply use the name remote.
  4. Select one of the cloud storage providers to authenticate with.
  5. Follow the instructions to authenticate with the cloud provider. See the note below about headless systems for cloud providers that require OAuth authorization.
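
Once the remote is created, a quick way to confirm that authorisation worked is to list the directories at its root (remote being the name chosen above):

rclone lsd remote: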

OAuth Authorization on Headless Systems

On headless systems, it can be difficult to complete the OAuth authorization flow. In this case, run rclone config on the headless system as usual and answer that you are working on a remote or headless machine when asked; rclone will then tell you to run rclone authorize on a machine that has a browser, and to paste the resulting token back into the config prompt. Installation on your personal machine is the same as listed above.
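
As a sketch of that flow (the exact provider string to pass is shown by the prompt on the headless machine):

# on your desktop or laptop, which has a browser
rclone authorize "amazon cloud drive"
# log in when the browser window opens; rclone then prints a token
# paste that token into the waiting rclone config prompt on the headless system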

Step 4: Configure Encryption

There are two ways you can configure encryption. You can encrypt your cloud account at the top level, meaning all folders (that were uploaded with rclone) will have encrypted names.

You can also leave the top-level directory names unencrypted so that they can be identified from the web GUI and any apps that the provider may have. This has the disadvantage that you have to run this configuration process for every single top-level folder in your cloud account; you can edit your config file manually if you have lots of folders to configure.

  1. Run rclone config.

  2. Press n.

  3. Type a name for the encrypted container. For this tutorial, we will assume the name secret.

  4. Choose crypt (option 5 at the time of writing).

  5. If you are encrypting the names of your top-level directories, just use remote: here.

    If you are leaving the top-level names unencrypted, specify the path instead, for example remote:/Backups/, and keep in mind that you will need to create one of these containers for each top-level directory.

  6. Type standard for encrypted file names.

  7. Choose a passphrase or generate one. It is very important you remember your passphrase. If you forget it, your backups will be irretrievable and there is no reset process.

  8. Choose a salt or generate one. This is optional but recommended, particularly with non-random passphrases. As with the passphrase, you must remember this salt.

  9. Press y to confirm the details and q to close rclone.

Repeat these steps to create as many encrypted containers as you need if you are keeping top-level directory names unencrypted.
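
For reference, the resulting section in rclone's config file (~/.config/rclone/rclone.conf on recent versions, ~/.rclone.conf on older ones) looks roughly like this; the container name, remote, and placeholder values below are illustrative:

[secret]
type = crypt
remote = remote:
filename_encryption = standard
password = *** obscured passphrase ***
password2 = *** obscured salt ***

Note that the passwords are stored in a reversibly obscured form, not securely encrypted, which is why the config file itself needs protecting (see Step 7).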

Step 5: Upload Your First Backup

You need to know what name you selected for your encrypted container in Step 4.

If you decided to encrypt your entire cloud account, including top-level directory names, specify which folder you want to place backups in (I'll assume it's Backups) by running rclone sync /path/to/local/folder secret:/Backups/.

If you left the top-level directory names unencrypted - that is, you put in more than just remote: when you set up encryption in Step 4 - then you don't specify that folder when backing up: rclone sync /path/to/local/folder secret:.

If this is going to be a long upload on your connection, change the command like this so that it records output to a log file and doesn't get killed when you log out:

setsid [command] --log-file /path/to/file.log &>/dev/null
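
For example, with a hypothetical source folder and log path, the full backup command becomes:

setsid rclone sync /home/user/Documents secret:/Backups/ --log-file /home/user/rclone.log &>/dev/null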

Step 6: Automatically Update Backups

These instructions are for Unix-like systems, including macOS. /u/z_Boop has instructions on how to schedule automatic updates under Windows; see his comment for details.

You will want to create a cron job to automatically back up incremental changes. For those unaware, cron jobs are a way to schedule tasks to run at intervals on Unix-like systems. This section will guide you through creating a cron job that runs every hour to update your backups, but you'll have to make some decisions first:

  • Decide whether you want to use sync or copy.

    Sync will mirror a directory exactly from the source to the destination, deleting files from the destination that have been removed on the source. This is good, for example, for the working directory of an application (like the Plex database) where the application expects the directory exactly as it was left and already handles its own backups.

    Copy will copy new files to the destination, but it won't remove files from the destination that have been removed from the source. This is good, for example, for a backups folder where old backups are automatically deleted from the local disk to save space after a period of time, but you might as well leave them in your cloud storage since space is unlimited.

    I will represent this choice in the cron job with [rclone mode].

  • Decide whether you want to log output to a file.

    You can log the output of backup jobs to a file. In most cases this is unnecessary, but if you run into issues you can enable it by adding --log-file /path/to/file.log just before the >/dev/null redirection, with a space on either side of this snippet.

  1. Run crontab -e
  2. At the end of the file, on a blank line, add this line: 0 * * * * /usr/bin/setsid /usr/sbin/rclone [rclone mode] /path/to/local/folder secret:[optional path] >/dev/null 2>&1. Note that cron runs jobs with /bin/sh, which doesn't understand bash's &> shorthand, so the redirection is spelled out here. [optional path] should reflect what you did when you made your initial backup. A filled-in example follows this list.
  3. Save and exit your file. If you chose nano (the easiest of the editors suggested by crontab -e), press Control+O, then Enter, then Control+X.
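
Put together, a filled-in example (hourly, copy mode, hypothetical paths, with logging) would look like:

0 * * * * /usr/bin/setsid /usr/sbin/rclone copy /home/user/Backups secret:/Backups/ --log-file /home/user/rclone-cron.log >/dev/null 2>&1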

One problem with this setup is that it will start overlapping upload jobs if the interval you've set comes around before the previous upload has finished. You can mitigate this by running your backup job as a systemd service and having your crontab start that service instead; a sketch follows. This is a little more complex, so if you want assistance setting it up, contact me. If demand is high for this type of setup, I will add it to the tutorial.
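
A minimal sketch of that approach, assuming a hypothetical unit file at /etc/systemd/system/rclone-backup.service:

# /etc/systemd/system/rclone-backup.service (hypothetical name)
[Unit]
Description=rclone backup job

[Service]
Type=oneshot
User=USER
ExecStart=/usr/sbin/rclone copy /path/to/local/folder secret:/Backups/

Then, in root's crontab, replace the rclone invocation with systemctl start rclone-backup.service. Because the unit is Type=oneshot, a start issued while a previous run is still active joins the running job instead of launching a second copy.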

Step 7: Restore

To restore your backup, you first need to set up your cloud account with rclone as detailed earlier (if this is a new machine) and recreate your encrypted containers. Backing up your configuration file can help with this process, but as long as you know your account login and your encryption passphrases and salts, you will have no issue. If you do choose to back up your config file, keep in mind that the keys in that file are only lightly obscured, not securely encrypted.

After that, you can run:

rclone sync secret:[optional path] /path/to/local/folder

You can follow the same steps as the initial sync for running it in the background.
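
If you want to confirm that a restore is complete, rclone can compare the two sides, checking sizes and hashes where the backend supports them:

rclone check secret:[optional path] /path/to/local/folder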

Mounting your Cloud Account with FUSE

This is a new section, particularly suited to those who want to run media servers like Plex off their cloud storage. It requires running rclone on Linux with FUSE installed, or on macOS with FUSE for macOS installed.

I have found the most successful command for a media server FUSE mount to be:

/usr/sbin/rclone mount --umask 0 --allow-other --max-read-ahead 200M secret: MOUNT_POINT &

You can stop a FUSE mount with fusermount -u MOUNT_POINT on Linux, or umount MOUNT_POINT on macOS.

You will probably want to put these commands into a system service. systemd is used on most Linux distributions these days; a sensible systemd service is included below. Remember to change the mount point and user.

[Unit]
Description=rclone FUSE mount
After=network.target

[Service]
Type=simple
User=USER
ExecStart=/usr/sbin/rclone mount --umask 0 --allow-other --max-read-ahead 200M secret: MOUNT_POINT
ExecStop=/bin/fusermount -u MOUNT_POINT
Restart=always

[Install]
WantedBy=multi-user.target
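
Assuming you saved the unit as /etc/systemd/system/rclone-mount.service (a hypothetical name), you would enable and start it with:

sudo systemctl daemon-reload
sudo systemctl enable rclone-mount.service
sudo systemctl start rclone-mount.service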

That's it!

Thanks for reading through my tutorial. If you have any questions, feel free to comment or message me and I will do my best to help you. As mentioned, I have no experience with Windows, although I imagine the process is similar; if anyone wants to contribute information about Windows, send me a message and I will credit you.

I maintain this document on GitHub Gist, this latest revision was published on 18 May 2017 (revisions).

  • 24 October 2016: First version
  • 25 October 2016: Added comment about using setsid to run uploads in the background.
  • 21 November 2016: Updated version, improved general grammar and style.
    • Updated the introductory paragraph to be less personal.
    • Added a link to the changelog on GitHub Gist.
    • Added links to the ACD usage of other redditors to the Sign Up for ACD step.
    • Reworded the step where you choose the remote name for ACD in Step 3 to say that I will assume that the name amazon is used in order to make the tutorial simpler.
    • Reworded the first paragraph in the Configure Encryption step to mention the Plex Cloud Beta.
    • Redid the instructions in the Configure Encryption step so that I don't need to duplicate the content.
    • Reworded the Upload Your First Backup section to make it more clear how to run an upload in the background.
    • Changed the Automatically Update Backups section to highlight the difference between sync and copy, and whether you want to log the output of running backups. I want to draw the attention of those who completed the original tutorial to make sure that they are using the desired setting for sync vs copy; I didn't word this clearly enough in the first version and want to apologise for that.
    • Updated the restore section to mention backing up config files.
    • Rewrote the conclusion, removing the bit about being new and late at night, keeping the thanks for the gold bit.
  • 22 November 2016: Included note from /u/z_Boop about scheduling tasks on Windows.
  • 23 November 2016: Fixed a Markdown issue where angle brackets were interpreted as HTML tags and prevented content from displaying. Also fixed ACD not being linked in the first paragraph.
  • 25 November 2016: Added note from /u/WouterNL about adding rclone to the Windows PATH.
  • 27 November 2016: Fixed formatting issue with angle brackets again.
  • 12 December 2016: Fixed Debian installation instructions to actually use wget instead of curl.
  • 31 December 2016: The numbering was not done properly after Step 4. Fixed. Thanks to /u/loki_racer for pointing this out.
  • 13 January 2017: Updated version, including:
    • Edited the tutorial to not be ACD-specific
    • Restructured some sections of the tutorial to be more neutral
    • Included new section on FUSE mounting
  • 30 January 2017: General housekeeping
    • Fixed a typo in the systemd service for FUSE mounting. Thanks to /u/josho493 for pointing this out.
    • Added a note to the cron scheduling section about multiple backup tasks and using systemd.
    • Fixed the year for the release notes - it's 2017 now.
  • 18 May 2017: Added advisory related to the changing situation with Amazon Cloud Drive