How Hubitat Backup/Restore has Saved My Ass

What is your wget command? When I run what you posted above it creates a file called "backupDB?fileName=latest" that is 2.4 MB in size. It's the right size to be a backup file, but the name is wrong.

When I go to that address through the browser it produces a time stamped file name.

Here's a batch file I run on a Windows machine. Modify it as you want. One thing you may need to tweak is how the date info is grabbed, since the %DATE% format differs based on the locale settings of the machine.
The wget command is at the bottom.

@echo off
pushd %~dp0
set hubaddress=hubitat.localdomain
:: pull the date/time pieces out of %DATE% and %TIME% -- these offsets
:: depend on the machine's locale settings
set year=%DATE:~10,4%
set day=%DATE:~7,2%
set mnt=%DATE:~4,2%
set hr=%TIME:~0,2%
set min=%TIME:~3,2%
:: zero-pad any single-digit (space-padded) values
IF %day% LSS 10 SET day=0%day:~1,1%
IF %mnt% LSS 10 SET mnt=0%mnt:~1,1%
IF %hr% LSS 10 SET hr=0%hr:~1,1%
IF %min% LSS 10 SET min=0%min:~1,1%
set backuptime=Hubitat-%year%%mnt%%day%-%hr%%min%.lzf
:: -c resumes a partial download; -O forces our timestamped file name
.\wget\wget.exe -c --content-disposition -O .\%backuptime% "http://%hubaddress%/hub/backupDB?fileName=latest"
popd

Thanks

For anyone that is interested: This is the command I put in my crontab on my Ubuntu server:

00 4 * * * wget -O- "http://192.168.20.20/hub/backupDB?fileName=latest" >>/path/to/backup/folder/$(date -d "today" +"\%Y-\%m-\%d_\%H:\%M").lzf

That generates a time-stamped file at 4 am every day. (The % signs are escaped with backslashes because cron treats a bare % as a newline.)


Even simpler:

00 4 * * * wget --content-disposition -P /path/to/backup/folder/ http://192.168.20.20/hub/backupDB?fileName=latest

And this uses the fileName hubitat gives it, which has the firmware version appended also.


And if you want to remove files older than X days, I use this in my crontab:

@daily find /path/to/backup/folder/ -mindepth 1 -type f -mtime +X -delete

where X is the age in days beyond which files are removed -- I use 5


Should anyone want to do this on Windows with curl:

  1. Create a directory C:/hubBackup (or any name you want) to store the backups

  2. Create a daily scheduled task with the following action, filling the optional "start in" field with C:/hubBackup/
    curl --url http://HubIP/hub/backupDB?fileName=latest -OJ

  3. To delete backups over 10 days old
    ForFiles /p "C:\hubBackup" /s /m *.lzf /d -10 /c "cmd /c del @file"

Note: files with matching names are NOT overwritten


I wrote a Linux shell script that works with rclone to upload the file to Google Drive and then delete the file from the local storage. It does not auto-delete the Drive copies, though. That's fine, as even with relatively large backups (4 MB), 1 GB can hold 250 days' worth, leaving plenty of space for Gmail and your Google Drive files. The shell script should be kept in a directory called Hubitat_Backup and should be run through sudo crontab.

# quote the URL so the shell doesn't try to expand the "?"
wget "http://HubIP/hub/backupDB?fileName=latest"
# rename whichever copy wget produced to a date-stamped .lzf
sudo cp /home/pi/Hubitat_Backup/'backupDB?fileName=latest' /home/pi/Hubitat_Backup/$(date +%m-%d-%Y.lzf) 2> /dev/null
sudo cp /home/pi/Hubitat_Backup/'backupDB?fileName=latest.1' /home/pi/Hubitat_Backup/$(date +%m-%d-%Y.lzf) 2> /dev/null
sudo rm /home/pi/Hubitat_Backup/'backupDB?fileName=latest'*
# upload to Google Drive, prune Drive copies older than 30 days, then clean up locally
sudo rclone copy /home/pi/Hubitat_Backup/*.lzf Google_Drive:Hubitat_Backups/
sudo rclone delete Google_Drive:Hubitat_Backups --min-age 30d
sleep 1s
sudo rm /home/pi/Hubitat_Backup/*.lzf

Make sure to change "pi" to your username if your home directory differs
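
To schedule it, assuming the commands above are saved as an executable script at /home/pi/Hubitat_Backup/backup.sh (a hypothetical name -- adjust to whatever you call it), the crontab entry might look like:

```shell
# edit root's crontab with: sudo crontab -e
# run the backup script daily at 4:00 am
0 4 * * * /home/pi/Hubitat_Backup/backup.sh
```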

Here's my VB6 program to download from all 4 hubs, run daily at 3:45 am:

Option Explicit

Private Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
 (ByVal pCaller As Long, _
 ByVal szURL As String, _
 ByVal szFileName As String, _
 ByVal dwReserved As Long, _
 ByVal lpfnCB As Long) As Long

Sub Main()
    Dim ret As Long

    ret = URLDownloadToFile(0, "http://192.168.1.10/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 0.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.11/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 1.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.12/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 2.lzf", 0, 0)
    ret = URLDownloadToFile(0, "http://192.168.1.13/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 3.lzf", 0, 0)
End Sub

I have edited my program to auto-delete the files from Google Drive after 30 days

Hi all

I'm not a programmer. Can anyone tell me how to setup something similar on my Mac?

Thanks.

EDIT: Don't worry, I can do this using Automator. All good!

@dorian.workman I originally tried to do this on my Mac. Since macOS is Unix-based, all the commands work as long as you install rclone and wget. The issue is that cron has been deprecated on Macs (since OS X 10.4), and launchd feels a bit too complicated for me. If somebody else could help @dorian.workman with launchd, it might be possible.
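
For anyone who wants to attempt the launchd route, here is a minimal sketch of a LaunchAgent (untested; the label, wget path, download folder, and HubIP are all placeholders to adjust):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- save as ~/Library/LaunchAgents/com.example.hubitat-backup.plist, then:
         launchctl load ~/Library/LaunchAgents/com.example.hubitat-backup.plist -->
    <key>Label</key>
    <string>com.example.hubitat-backup</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/wget</string>
        <string>--content-disposition</string>
        <string>-P</string>
        <string>/Users/you/Documents/Hubitat</string>
        <string>http://HubIP/hub/backupDB?fileName=latest</string>
    </array>
    <!-- fire daily at 4:00 am -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>4</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>
</dict>
</plist>
```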

Thanks for the reply. I actually got it working using a calendar-triggered automator workflow that includes a shell script to move the file.

@dorian.workman I didn't think of that :grinning:! If you want to upload straight to Google Drive, you can set up rclone and use that to move the file

No need, thanks. I just auto-download the file and then auto-move it from my Downloads folder to Documents. Documents is on my iCloud Drive already.


This is awesome, thanks for sharing this.
I don't need daily backups, but I just set this up for weekly backups on Monday, ran a test just now, and it worked great!

Thank you!


I actually took it one step further and wrote a fully self-contained AppleScript to download the file and move it. Script below for anyone who may find it useful. Just save it as an application and trigger it from a recurring calendar event. Easy.

on run
tell application "Safari"
	tell window 1
		set current tab to (make new tab with properties {URL:"http://HUB IP/hub/backupDB?fileName=latest"})
		delay 5
		close current tab
	end tell
end tell
tell application "Finder"
	set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
	set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
	set latestFile to last item of (sort (files in (sourceFolder)) by modification date)
	move latestFile to targetFolder
	set movedFile to name of last item of (sort (files in (targetFolder)) by modification date)
	display dialog "Hubitat backup is done: " & movedFile
end tell
end run


We've had a few shell scripts and batch files posted here, but I believe this is the first AppleScript posted on the Hubitat community!


Script further enhanced to delete files older than 30 days and display notifications (I was bored today....)

on run
tell application "Safari"
	tell window 1
		set current tab to (make new tab with properties {URL:"http://HUBIP/hub/backupDB?fileName=latest"})
		delay 5
		close current tab
	end tell
end tell
tell application "Finder"
	set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
	set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
	set latestFile to last item of (sort (files in (sourceFolder)) by modification date)
	move latestFile to targetFolder
	set movedFile to name of last item of (sort (files in (targetFolder)) by modification date)
end tell

set modDate to (30)

tell application "System Events"
	set fileList to (every item of targetFolder whose modification date is less than ((current date)) - modDate * days)
	set listOfNames to {}
	repeat with i in fileList
		set currentFileName to (the name of i)
		copy currentFileName to end of listOfNames
	end repeat
	set myString to items of listOfNames
	set my text item delimiters to ", "
	text items of listOfNames
	set listSize to count of fileList
end tell


tell application "Finder"
	if listSize is 0 then
		display notification "Hubitat backup complete. File downloaded: " & movedFile & ". No files deleted." with title "Hubitat Backup" sound name "Blow"
	else
		repeat with i in fileList
			delete i
		end repeat
		display notification "Hubitat backup complete. File downloaded: " & movedFile & ". File(s) deleted: " & myString & "." with title "Hubitat Backup" sound name "Blow"
		delay 1
	end if
end tell
end run

Can someone with more Linux/UNIX experience than me help me get my backups saved with the same name as the file on the Hubitat hub? I have messed around with some of the cron scripts from various threads on the community and arrived at the shell script below, which saves the backup with the date/time I download it, but I would prefer to take the name from the HE file itself. I'd also prefer to download the backup already taken by the hub rather than generating a new one with the "latest" web call. Thanks.

#!/bin/bash
# Script to download hubitat backup

backupdir='/home/pi/heBackups/c4'
he_ipaddr=192.168.0.8

cd "$backupdir" || exit 1

# keep only the 14 newest backups
ls -1tr | head -n -14 | xargs -d '\n' rm -f --

# scrape the hub's backup page for the newest file name, then download
# that file under the same name the hub uses
curl -s "http://$he_ipaddr/hub/backup" | grep download | awk -F'=' '{ print $5 }' | awk -F'"' '{ print $2 }' | sed '/^$/d' | tail -1 | xargs -I @ curl -s "http://$he_ipaddr/hub/backupDB?fileName=@" -o "@"

I haven't actually run a restore, but I just want to confirm: is the file size when copying to Windows really only 1.37K? When I download the backups manually from the hub to my Mac they are 1.4 MB, but when I download to Windows they are less than 2K. The hub reports they are 1403584, which I assume is bytes, making the Mac download the right size. Confusing.
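
One way to sanity-check a download is to compare the local byte count against the size the hub reports. A small sketch (the function name and example file name are made up; assumes a POSIX shell with wc):

```shell
# compare a downloaded backup's byte count against the size the hub reports
verify_backup() {
  local file=$1 expected=$2
  local actual
  # wc -c counts bytes; tr strips the padding BSD wc adds on macOS
  actual=$(wc -c < "$file" | tr -d ' ')
  if [ "$actual" -eq "$expected" ]; then
    echo "OK: $file is $actual bytes"
  else
    echo "MISMATCH: $file is $actual bytes, expected $expected"
  fi
}

# e.g.: verify_backup Hubitat-20200101-0400.lzf 1403584
```

If the Windows copy reports ~1.4K instead of ~1.4 MB here, something truncated the transfer.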