Backup script for Hubitat Elevation that runs on Synology NAS
#!/bin/bash
#https://gist.github.com/howellcc/70f5c45a9902f9a3fe4e566ceedf1bca
#This is a backup script that I'm running on my Synology NAS to back up my Hubitat Elevation.
he_login='[Hubitat Username]'
he_passwd='[Hubitat Password]'
he_ipaddr='[Hubitat IP Address]'
backupdir='[Backup Location]' #no trailing slash
hubname='Hubitat' #in case you have multiple hubs
versionsToKeep=60
#fix a silly off-by-one error: tail -n +N starts printing at line N,
#so bump the count by 1 to keep exactly $versionsToKeep files
((versionsToKeep=versionsToKeep+1))
cookiefilename='cookiefile.txt'
cookiefile="$backupdir/$cookiefilename"
backupfilename="${hubname}_$(date +%Y-%m-%d-%H%M).lzf"
#Log in and store the session cookie for the download requests
curl -k -c "$cookiefile" -d username="$he_login" -d password="$he_passwd" "https://$he_ipaddr/login"
#List the backups on the hub, take the newest one, strip the path prefix and
#quotes from its name, then download it. Bail out before pruning if this fails.
curl -k -sb "$cookiefile" "https://$he_ipaddr/hub2/localBackups" \
| jq '.[-1].path' \
| sed -e 's/.*dbBackup\///' -e 's/"//' \
| xargs -I @ curl -k -sb "$cookiefile" "https://$he_ipaddr/hub/backupDB?fileName=@" -o "$backupdir/$backupfilename" \
|| exit 1
#for some reason these end up with 777 permissions
chmod 440 "$backupdir/$backupfilename"
#Delete all but the most recent $versionsToKeep backups
ls -tp "$backupdir"/"${hubname}"_*.lzf | grep -v '/$' | tail -n +$versionsToKeep | xargs -I {} rm -- {}
@howellcc (Author)

Added functionality to only keep backups for X days.
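
For reference, day-based pruning like that is typically a single find invocation. This is a sketch using the script's variables, not necessarily the exact line from that revision:

#Keep backups for X days: delete anything older than $daysToKeep days.
#Assumes a find that supports -delete; on a minimal busybox find, pipe to rm instead.
daysToKeep=60
find "$backupdir" -name "${hubname}_*.lzf" -mtime +$daysToKeep -delete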

@marktheknife

Hi, thanks for sharing this. I just wanted to confirm: the backup file is the most recent one already on the hub, not a new backup taken at the time the script runs?

@howellcc (Author)

Correct. It downloads the most recent backup, but does not create a new one. My Hubitat is set to create a backup at 2am every day. I have this script scheduled to run at 2:10am to download the freshly created backup.
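
For anyone replicating the schedule, it boils down to one cron entry. A sketch with hypothetical script and log paths (on Synology DSM you would normally create this via Task Scheduler rather than editing /etc/crontab by hand; DSM's crontab also includes a user field):

#Run daily at 2:10am, ten minutes after the hub's own 2:00am backup job.
#The script path and log path are placeholders.
10 2 * * * root /volume1/scripts/hubitat-backup.sh >> /volume1/scripts/hubitat-backup.log 2>&1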

@marktheknife

Cool, thank you for clarifying. I noticed the .lzf file name includes the time the script executed, but now I see from the hub’s backup settings page that the actual backup times are at 3am as expected (my hub’s regular maintenance/backup time).

Now I have this script scheduled to run at 3:45am daily, so thanks again!

@howellcc (Author) commented Feb 22, 2024

I had to update this script because something changed on Hubitat's end and it no longer worked. I think this broke with the 2.3.7.138 release on 12/22/2023.

@howellcc (Author)

The earlier breakage of this script was preventing the download of new backups, but it was still removing backups that were over 60 days old. That was slowly removing ALL of my backups. I adjusted the script to keep the 60 most recent versions, regardless of how old they are, which prevents a repeat of that issue.

@marktheknife

Thanks for the update! I haven’t had a reason to check my backup destination folder for a couple months so hadn’t even realized there was an issue!!

@howellcc (Author)

> Thanks for the update! I haven’t had a reason to check my backup destination folder for a couple months so hadn’t even realized there was an issue!!

Yeah... my cron job started reporting errors when it ran out of backups to delete; otherwise I may never have found out. If/when I get some time, there will likely be a future iteration that sends a notification if backup retrieval fails, though that may require rewriting it in Python.
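
A failure notification wouldn't necessarily require Python, though; staying in bash, one option is a small helper that pings a notification service before exiting. A sketch, with the ntfy.sh topic URL as a stand-in for whatever service you actually use:

#Hypothetical failure handler; substitute your own notification endpoint.
notify_url='https://ntfy.sh/your-topic-here' #placeholder topic
fail() {
    curl -s -d "Hubitat backup failed: $1" "$notify_url" >/dev/null
    exit 1
}
#In the download pipeline, replace `|| exit 1` with:
#    ... || fail "could not download backup from $he_ipaddr"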
