How Hubitat Backup/Restore has Saved My Ass

I've spent many hours developing my own code on Hubitat, so I wrote a task on my server to download a backup from my hub once a day.

It's happened a few times now that I've accidentally messed up code on my hub that I'd put a lot of time into.

I was able to get it all back just by downloading a fresh backup, restoring a previous backup, grabbing the code in question, and then restoring the latest backup again to copy the code back in.

Saved my ass twice now.

LOVE IT!

11 Likes

Care to share how you have your server set up to do it? Just a shell script, or something more involved? I have an Ubuntu server and would love to automate my backups like yours.

On your server, just set up a cron job to do a wget on the following URL. It will make the hub create a new backup and download the file.

http://[HubIP]/hub/backupDB?fileName=latest
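
For example, a crontab entry along these lines would grab one every night (the time and path here are just placeholders):

0 3 * * * wget -q "http://[HubIP]/hub/backupDB?fileName=latest" -O /path/to/backups/hubitat-latest.lzf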

4 Likes

Thanks, fairly straightforward.

1 Like

What is your wget command? When I run what you posted above it creates a file called “backupDB?fileName=latest” that is 2.4 MB in size. It’s the right size to be a backup file, but the name is wrong.

When I go to that address through the browser it produces a time-stamped file name.

Here's a batch file I run on a Windows machine; modify as you want. One thing you may need to tweak is how the date info is grabbed, since the %DATE% variable format depends on the machine's locale settings.
The wget command is at the bottom.

@echo off
pushd %~dp0
set hubaddress=hubitat.localdomain
rem Pull the date/time apart (offsets assume a US-style %DATE%, e.g. "Tue 03/14/2023" -- adjust for your locale)
set year=%DATE:~10,4%
set day=%DATE:~7,2%
set mnt=%DATE:~4,2%
set hr=%TIME:~0,2%
set min=%TIME:~3,2%
rem Zero-pad single-digit values (%TIME% uses a leading space for hours before 10)
IF %day% LSS 10 SET day=0%day:~1,1%
IF %mnt% LSS 10 SET mnt=0%mnt:~1,1%
IF %hr% LSS 10 SET hr=0%hr:~1,1%
IF %min% LSS 10 SET min=0%min:~1,1%
set backuptime=Hubitat-%year%%mnt%%day%-%hr%%min%.lzf
.\wget\wget.exe -c --content-disposition -O .\%backuptime% http://%hubaddress%/hub/backupDB?fileName=latest
popd
2 Likes

Thanks

For anyone who is interested, this is the command I put in my crontab on my Ubuntu server:

00 4 * * * wget -O- http://192.168.20.20/hub/backupDB?fileName=latest >>/path/to/backup/folder/$(date -d "today" +"\%Y-\%m-\%d_\%H:\%M").lzf

That generates a time-stamped file at 4 am every day.

4 Likes

Even simpler:

00 4 * * * wget --content-disposition -P /path/to/backup/folder/ http://192.168.20.20/hub/backupDB?fileName=latest

And this uses the file name Hubitat gives it, which also has the firmware version appended.

8 Likes

And if you want to remove files older than X days, I use this in my crontab:

@daily find /path/to/backup/folder/ -mindepth 1 -type f -mtime +X -delete

where X is the age in days beyond which files get deleted -- I use 5

2 Likes

Should anyone want to do this on Windows with curl:

  1. Create a directory C:\hubBackup (or any name you want) to store the backups

  2. Create a daily scheduled task with the following action, filling the optional "Start in" field with C:\hubBackup\
    curl --url http://HubIP/hub/backupDB?fileName=latest -OJ

  3. To delete backups over 10 days old
    ForFiles /p "C:\hubBackup" /s /m *.lzf /d -10 /c "cmd /c del @file"

Note: matching file names are NOT overwritten
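
If you'd rather create the task from the command line than through the Task Scheduler GUI, something like this should work (the task name and start time here are my own examples):

schtasks /Create /TN "HubitatBackup" /SC DAILY /ST 03:00 /TR "cmd /c cd /d C:\hubBackup && curl --url http://HubIP/hub/backupDB?fileName=latest -OJ"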

7 Likes

I wrote a Linux shell script that works with rclone to upload the file to Google Drive and then delete the file from local storage. It does not auto-delete, though. That's fine, as even with relatively large backups (4 MB), it can hold 250 days' worth using only 1 GB, leaving plenty of space for Gmail and your Google Drive files. The shell script should be kept in a directory called Hubitat_Backup and should be run through sudo crontab.

#!/bin/bash
# Work out of the backup directory -- cron won't start the script there on its own
cd /home/pi/Hubitat_Backup || exit 1
# Grab a fresh backup from the hub
wget 'http://HubIP/hub/backupDB?fileName=latest'
# Copy the download to a dated .lzf name (the second cp catches wget's ".1" suffix if the first name was taken)
sudo cp /home/pi/Hubitat_Backup/'backupDB?fileName=latest' /home/pi/Hubitat_Backup/$(date +%m-%d-%Y.lzf) 2> /dev/null
sudo cp /home/pi/Hubitat_Backup/'backupDB?fileName=latest.1' /home/pi/Hubitat_Backup/$(date +%m-%d-%Y.lzf) 2> /dev/null
sudo rm /home/pi/Hubitat_Backup/*backupDB?fileName=latest*
# Upload to Google Drive, prune remote copies older than 30 days, then clean up locally
sudo rclone copy /home/pi/Hubitat_Backup/*.lzf Google_Drive:Hubitat_Backups/
sudo rclone delete Google_Drive:Hubitat_Backups --min-age 30d
sleep 1s
sudo rm /home/pi/Hubitat_Backup/*.lzf

Make sure to change "pi" to your own user's home directory.
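
To schedule it, an entry along these lines in sudo crontab -e should do the trick (the script name and time are just examples):

0 3 * * * /home/pi/Hubitat_Backup/hubitat_backup.sh

(Remember to make the script executable with chmod +x first.)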

Here's my VB6 program to download from all 4 hubs, run daily at 3:45 am:

Option Explicit

Private Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
 (ByVal pCaller As Long, _
 ByVal szURL As String, _
 ByVal szFileName As String, _
 ByVal dwReserved As Long, _
 ByVal lpfnCB As Long) As Long

Sub Main()
    Dim ret As Long

    ' URLDownloadToFile returns 0 (S_OK) on success
    ret = URLDownloadToFile(0, "http://192.168.1.10/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 0.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.11/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 1.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.12/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 2.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.13/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 3.lzf", 0, 0)
End Sub

I have edited my script above to auto-delete the files from Google Drive after 30 days.

Hi all

I'm not a programmer. Can anyone tell me how to set up something similar on my Mac?

Thanks.

EDIT: Don't worry, I can do it using Automator. All good!

@dorian.workman I originally tried to do this on my Mac. Since macOS is Unix-based, all the commands work as long as you install rclone and wget. The issue is that cron has been deprecated on Macs (since OS X 10.4) in favor of launchd, and launchd feels a bit too complicated for me. If somebody else could help @dorian.workman with launchd, it might be possible.
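
From what I've read, a LaunchAgent along these lines ought to do it, though I haven't tested it myself (the label, script path, and time are made-up examples). Save it as a .plist in ~/Library/LaunchAgents:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.hubitat-backup</string>
    <!-- hypothetical script that does the wget and moves the file -->
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/hubitat-backup.sh</string>
    </array>
    <!-- run daily at 4:00 am -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>4</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>

Then load it with launchctl load ~/Library/LaunchAgents/com.example.hubitat-backup.plist.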

Thanks for the reply. I actually got it working using a calendar-triggered Automator workflow that includes a shell script to move the file.

@dorian.workman I didn't think of that :grinning:! If you want to upload straight to Google Drive, you can set up rclone and use that to move the file.

No need, thanks. I just auto-download the file and then auto-move it from my Downloads folder to Documents. Documents is on my iCloud Drive already.

1 Like

This is awesome, thanks for sharing.
I don't need daily backups, but I just set this up to run weekly on Mondays, ran a test just now, and it worked great!

Thank you!

1 Like

I actually took it one step further and wrote a fully self-contained AppleScript to download the file and move it. Script below for anyone who may find it useful. Just save this as an application and trigger it from a recurring calendar event. Easy.

on run
    -- Use Safari to hit the backup URL; the hub responds with a file download
    tell application "Safari"
        tell window 1
            set current tab to (make new tab with properties {URL:"http://HUB IP/hub/backupDB?fileName=latest"})
            delay 5
            close current tab
        end tell
    end tell
    -- Move the newest file in Downloads to the Hubitat folder, then confirm
    tell application "Finder"
        set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
        set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
        set latestFile to last item of (sort (files in (sourceFolder)) by modification date)
        move latestFile to targetFolder
        set movedFile to name of last item of (sort (files in (targetFolder)) by modification date)
        display dialog "Hubitat backup is done: " & movedFile
    end tell
end run

2 Likes