What is your wget command? When I run what you posted above it creates a file called “backupDB?fileName=latest” that is 2.4mb in size. It’s the right size to be a backup file, but the name is wrong.
When I go to that address through the browser it produces a time stamped file name.
Here's a batch file I run on a Windows machine. Modify as you want. One thing you may need to tweak is how the date info is grabbed, since the substring offsets differ based on the locale settings of the machine. The wget command is at the bottom.
@echo off
pushd %~dp0
set hubaddress=hubitat.localdomain
rem Pull the date/time apart; these substring offsets depend on the
rem machine's locale date format, so adjust them as needed.
set year=%DATE:~10,4%
set day=%DATE:~7,2%
set mnt=%DATE:~4,2%
set hr=%TIME:~0,2%
set min=%TIME:~3,2%
rem Zero-pad any single-digit values (hours come back space-padded).
IF %day% LSS 10 SET day=0%day:~1,1%
IF %mnt% LSS 10 SET mnt=0%mnt:~1,1%
IF %hr% LSS 10 SET hr=0%hr:~1,1%
IF %min% LSS 10 SET min=0%min:~1,1%
set backuptime=Hubitat-%year%%mnt%%day%-%hr%%min%.lzf
rem -c resumes a partial download; -O names the saved file explicitly.
.\wget\wget.exe -c --content-disposition -O .\%backuptime% http://%hubaddress%/hub/backupDB?fileName=latest
popd
Should anyone want to do this in Windows with curl:
Create a directory C:\hubBackup (or any name you want) to store the backups.
Create a daily scheduled task with the following action, filling the optional "Start in" field with C:\hubBackup\ :
curl --url http://HubIP/hub/backupDB?fileName=latest -OJ
To delete backups over 10 days old:
ForFiles /p "C:\hubBackup" /s /m *.lzf /d -10 /c "cmd /c del @file"
I wrote a Linux shell script that works with rclone to upload the file to Google Drive and then delete the file from local storage. It does not auto-delete old backups, but that's fine: even at relatively large backup sizes (4 MB), 1 GB of Drive space holds 250 days' worth, leaving plenty of room for Gmail and your Google Drive files. The shell script should be kept in a directory called Hubitat_Backup and run through sudo crontab.
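For anyone who wants to roll their own, a rough sketch of how such a script could look (the hub address, the rclone remote name "gdrive", and the paths are assumptions, not the actual setup above; adjust all of them):

```shell
#!/bin/sh
# Sketch of a download-then-upload backup script. Assumed values:
# hub at 192.168.1.10, rclone remote named "gdrive", working
# directory ~/Hubitat_Backup. Replace all three with your own.
BACKUP_DIR="$HOME/Hubitat_Backup"
HUB="192.168.1.10"

mkdir -p "$BACKUP_DIR"

# Name the backup with a timestamp, e.g. Hubitat-20240101-0345.lzf
FILE="$BACKUP_DIR/Hubitat-$(date +%Y%m%d-%H%M).lzf"

# Grab the latest backup from the hub (short timeout so cron jobs
# don't hang), upload it to Drive, and delete the local copy only
# if the upload succeeded.
wget -q -T 5 -t 1 -O "$FILE" "http://$HUB/hub/backupDB?fileName=latest" &&
  rclone copy "$FILE" gdrive:Hubitat_Backup &&
  rm "$FILE" ||
  echo "Hubitat backup failed" >&2
```

Dropped into sudo crontab (e.g. `45 3 * * * /root/Hubitat_Backup/backup.sh`) it runs daily at 3:45 am.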
Here's my VB6 program to download from all 4 hubs, run daily at 3:45 am:
Option Explicit
Private Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
(ByVal pCaller As Long, _
ByVal szURL As String, _
ByVal szFileName As String, _
ByVal dwReserved As Long, _
ByVal lpfnCB As Long) As Long
Sub Main()
Dim ret As Long
ret = URLDownloadToFile(0, "http://192.168.1.10/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 0.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.11/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 1.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.12/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 2.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.13/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 3.lzf", 0, 0)
End Sub
@dorian.workman I originally tried to do this on my Mac. Since macOS is Unix-based, all the commands work as long as you install rclone and wget. The issue is that cron has been deprecated on Macs (since OS X 10.4), and launchd feels a bit too complicated for me. If somebody else could help @dorian.workman with launchd, it might be possible.
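For anyone willing to brave launchd, a minimal LaunchAgent sketch might look like this (the label, the script path, and the 3:45 am schedule are all assumptions; point ProgramArguments at your own backup script):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label and script path; change to match your setup -->
    <key>Label</key>
    <string>com.example.hubitat-backup</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/Hubitat_Backup/backup.sh</string>
    </array>
    <!-- Run daily at 03:45 -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>3</integer>
        <key>Minute</key>
        <integer>45</integer>
    </dict>
</dict>
</plist>
```

Save it as ~/Library/LaunchAgents/com.example.hubitat-backup.plist and activate it with `launchctl load ~/Library/LaunchAgents/com.example.hubitat-backup.plist`.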
This is awesome, thanks for sharing.
I don't need daily backups, but I just set this up to run weekly on Mondays, and a test run just now worked great!
I actually took it one step further and wrote a fully self-contained AppleScript to download the file and move it. Script below for anyone who may find it useful. Just save it as an application and trigger it from a recurring calendar event. Easy.
on run
tell application "Safari"
tell window 1
set current tab to (make new tab with properties {URL:"http://HUBIP/hub/backupDB?fileName=latest"})
delay 5
close current tab
end tell
end tell
tell application "Finder"
set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
set latestFile to last item of (sort (files in (sourceFolder)) by modification date)
move latestFile to targetFolder
set movedFile to name of last item of (sort (files in (targetFolder)) by modification date)
display dialog "Hubitat backup is done: " & movedFile
end tell
end run
Script further enhanced to delete files older than 30 days and display notifications (I was bored today....)
on run
tell application "Safari"
tell window 1
set current tab to (make new tab with properties {URL:"http://HUBIP/hub/backupDB?fileName=latest"})
delay 5
close current tab
end tell
end tell
tell application "Finder"
set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
set latestFile to last item of (sort (files in (sourceFolder)) by modification date)
move latestFile to targetFolder
set movedFile to name of last item of (sort (files in (targetFolder)) by modification date)
end tell
set modDate to (30)
tell application "System Events"
set fileList to (every item of targetFolder whose modification date is less than ((current date)) - modDate * days)
set listOfNames to {}
repeat with i in fileList
set currentFileName to (the name of i)
copy currentFileName to end of listOfNames
end repeat
set my text item delimiters to ", "
set myString to listOfNames as text
set listSize to count of fileList
end tell
tell application "Finder"
if listSize is 0 then
display notification "Hubitat backup complete. File downloaded: " & movedFile & ". No files deleted." with title "Hubitat Backup" sound name "Blow"
else
repeat with i in fileList
delete i
end repeat
display notification "Hubitat backup complete. File downloaded: " & movedFile & ". File(s) deleted: " & myString & "." with title "Hubitat Backup" sound name "Blow"
delay 1
end if
end tell
end run
Can someone with more Linux/UNIX experience than me help me get my backups saved with the same name as the file on the Hubitat hub? I have messed around with some of the cron scripts from various threads on the community and arrived at the shell script below, which saves the backup with the date/time I download it, but I would prefer to take the name from the HE file itself. I'd also prefer to download the backup already taken by the hub rather than generating a new one with the "latest" web call. Thanks.
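One approach, assuming the hub sends a Content-Disposition header with its own timestamped name (the earlier batch script in this thread passes `--content-disposition` too, which suggests it does): let wget or curl use the server-supplied name instead of generating one. The hub IP below is a placeholder.

```shell
#!/bin/sh
# Save the backup under the name the hub itself supplies
# (relies on the hub returning a Content-Disposition header).
# Replace 192.168.1.10 with your hub's address.

# wget: --content-disposition honors the server-provided file name;
# note there is no -O here, since -O would override that name.
wget --content-disposition -T 5 -t 1 \
  "http://192.168.1.10/hub/backupDB?fileName=latest" ||
  echo "download failed (check the hub address)" >&2

# or with curl: -O saves to a file, -J uses the Content-Disposition name
# curl -OJ "http://192.168.1.10/hub/backupDB?fileName=latest"
```

I don't know of a web endpoint that fetches an existing on-hub backup rather than triggering a fresh one, so this only solves the naming half of the question.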
I haven't actually run a restore, but I just want to confirm: the file size when copying to Windows is only 1.37 KB? When I download them manually from the hub to my Mac they are 1.4 MB, yet when I download to Windows they are less than 2 KB. And on the hub they report as 1403584, which I assume is bytes, making the Mac download the right size. Confusing.
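A download that small might be an HTML error page rather than the database itself. A quick way to check (the hub IP and file name below are placeholders) is to look at the response headers and the first bytes of the saved file:

```shell
#!/bin/sh
# Sanity-check a suspiciously small backup download.
# Replace the IP and file name with your own.

# A genuine backup should report a Content-Length near 1403584;
# a tiny response is likely an error or login page instead.
curl -sI -m 5 "http://192.168.1.10/hub/backupDB?fileName=latest" | head -20

# Peek at the first bytes of the saved file: readable HTML means the
# download failed, binary data suggests it is the real backup.
head -c 200 Hubitat.lzf | od -c | head -5
```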