@howellcc
Last active May 11, 2024 11:32
Backup script for Hubitat Elevation that runs on Synology NAS
#!/bin/bash
#https://gist.github.com/howellcc/70f5c45a9902f9a3fe4e566ceedf1bca
#This is a backup script that I'm running on my Synology NAS to back up my Hubitat Elevation.
he_login='[Hubitat Username]'
he_passwd='[Hubitat Password]'
he_ipaddr='[Hubitat IP Address]'
backupdir='[Backup Location]' #no trailing slash
hubname='Hubitat' #in case you have multiple hubs
versionsToKeep=60
#tail -n +N starts printing at line N, so add 1 to keep exactly versionsToKeep files
((versionsToKeep=versionsToKeep+1))
cookiefilename='cookiefile.txt'
cookiefile=$backupdir/$cookiefilename
backupfilename=${hubname}_$(date +%Y-%m-%d-%H%M).lzf
#Log in and store the session cookie
curl -k -c "$cookiefile" -d "username=$he_login" -d "password=$he_passwd" "https://$he_ipaddr/login"
#Fetch the list of local backups, take the newest one's path,
#strip the dbBackup/ prefix, and download that file.
curl -k -sb "$cookiefile" "https://$he_ipaddr/hub2/localBackups" \
| jq -r '.[-1].path' \
| sed -e 's/.*dbBackup\///' \
| xargs -I @ curl -k -sb "$cookiefile" "https://$he_ipaddr/hub/backupDB?fileName=@" -o "$backupdir/$backupfilename" \
|| exit 1
#for some reason these end up with 777 permissions
chmod 440 "$backupdir/$backupfilename"
#Delete all but the most recent X backups
ls -tp "$backupdir"/*.lzf | grep -v '/$' | tail -n "+$versionsToKeep" | xargs -I {} rm -- {}

howellcc commented Feb 22, 2024

I had to update this script because something changed on Hubitat's end and it no longer worked. I think this broke with the 2.3.7.138 release on 12/22/2023.

@howellcc

The earlier breakage of this script was preventing the download of new backups, but was still removing backups that were over 60 days old. This was slowly removing ALL of my backups. I adjusted the script to keep the 60 most recent versions, regardless of how old they are. This prevents my previous issue.
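The count-based pruning can be demonstrated in isolation. This is a minimal sketch in a throwaway temp directory; the filenames and the keep count of 3 are made-up examples, while the real script points the same pipeline at its backup directory:

```shell
#!/bin/bash
# Demonstrate count-based pruning: keep the N newest *.lzf files, delete the rest.
tmpdir=$(mktemp -d)
for i in 1 2 3 4 5; do
    # Give each file an explicit, distinct mtime so 'ls -t' ordering is deterministic
    touch -t "2024010${i}0000" "$tmpdir/Hubitat_2024-01-0${i}-0000.lzf"
done
versionsToKeep=3
((versionsToKeep=versionsToKeep+1))   # tail -n +N starts printing AT line N
# 'ls -tp' lists newest first; lines N onward are the surplus oldest files.
ls -tp "$tmpdir"/*.lzf | grep -v '/$' | tail -n "+$versionsToKeep" | xargs -I {} rm -- {}
remaining=$(ls "$tmpdir"/*.lzf | wc -l)
echo "$remaining backups kept"   # prints "3 backups kept"
```

Because the cutoff is a file count rather than a file age, a stretch of failed downloads no longer eats into the surviving backups.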

@marktheknife

Thanks for the update! I haven’t had a reason to check my backup destination folder for a couple months so hadn’t even realized there was an issue!!

@howellcc

> Thanks for the update! I haven’t had a reason to check my backup destination folder for a couple months so hadn’t even realized there was an issue!!

Yeah... my cron job started reporting errors when it ran out of backups to delete. Otherwise I may have never found out. If/When I get some time, there will likely be a future iteration that includes notification should backup retrieval fail. Though that may require re-writing it in python.
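Short of a Python rewrite, one stopgap would be a size sanity check right after the download, so the script itself exits non-zero (and a cron or Task Scheduler email fires) when retrieval fails. `check_backup` below is a hypothetical helper, not part of the script above, and the 1024-byte floor is an arbitrary assumption:

```shell
#!/bin/bash
# Hypothetical post-download check: fail loudly (non-zero exit) if the backup
# file is missing or implausibly small.
check_backup() {
    local file=$1 min_bytes=${2:-1024}   # 1024-byte floor is an arbitrary guess
    local size
    size=$(stat -c %s "$file" 2>/dev/null || echo 0)
    if [ "$size" -lt "$min_bytes" ]; then
        echo "Backup failed or truncated: $file ($size bytes)" >&2
        return 1
    fi
    echo "Backup OK: $file ($size bytes)"
}
```

Called as `check_backup "$backupdir/$backupfilename" || exit 1` before the pruning step, it would also stop a failed download from triggering another round of deletions.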
