I've spent many hours developing my own code on Hubitat, so I wrote a task on my server to download a backup from my Hubitat hub once a day.
It's happened a few times now that I accidentally messed up code on my hub that I'd put too much time into.
I was able to get it all back just by downloading a fresh backup, restoring a previous backup, grabbing the code in question, and then restoring the latest backup again to copy the code back in.
Care to share how you have your server set up to do it? Just a shell script, or something more involved? I have an Ubuntu server and would love to automate my backups like yours.
What is your wget command? When I run what you posted above it creates a file called “backupDB?fileName=latest” that is 2.4 MB in size. It’s the right size to be a backup file, but the name is wrong.
When I go to that address through the browser it produces a time stamped file name.
Here's a batch file I run on a Windows machine. Modify as you want. One thing you may need to tweak is how the date info is grabbed, as the substring offsets differ based on the locale settings on the machine.
The wget command is at the bottom.
@echo off
pushd %~dp0
set hubaddress=hubitat.localdomain
rem Pull the date/time apart (substring offsets depend on the locale's date format)
set year=%DATE:~10,4%
set day=%DATE:~7,2%
set mnt=%DATE:~4,2%
set hr=%TIME:~0,2%
set min=%TIME:~3,2%
rem Zero-pad any single-digit values
IF %day% LSS 10 SET day=0%day:~1,1%
IF %mnt% LSS 10 SET mnt=0%mnt:~1,1%
IF %hr% LSS 10 SET hr=0%hr:~1,1%
IF %min% LSS 10 SET min=0%min:~1,1%
rem Build a timestamped filename, e.g. Hubitat-20240102-0345.lzf
set backuptime=Hubitat-%year%%mnt%%day%-%hr%%min%.lzf
.\wget\wget.exe -c --content-disposition -O .\%backuptime% http://%hubaddress%/hub/backupDB?fileName=latest
popd
Should anyone want to do this in Windows with curl:
Create a directory C:\hubBackup (or any name you want) to store the backups.
Create a daily scheduled task whose action runs curl --url http://HubIP/hub/backupDB?fileName=latest -OJ, and fill the optional "Start in" field with C:\hubBackup\ so the backup lands in that directory. (The -OJ flags tell curl to save the file under the name the hub supplies rather than the URL's name.)
To delete backups over 10 days old:
ForFiles /p "C:\hubBackup" /s /m *.lzf /d -10 /c "cmd /c del @file"
I wrote a Linux shell script that works with rclone to upload the file to Google Drive and then delete the file from the local storage. It does not auto-delete old backups, though. That's fine, as even with relatively large backups (4 MB each), 1 GB holds 250 days' worth, leaving plenty of space for Gmail and your Google Drive files. The shell script should be kept in a directory called Hubitat_Backup and should be run through sudo crontab.
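For anyone who wants a starting point, here is a minimal sketch of what such a script could look like. The hub IP, directory, rclone remote name, and script filename below are all assumptions you'd adjust for your own setup:

```shell
#!/bin/sh
# Sketch of a Hubitat download-and-upload script (all names here are placeholders).
HUB_IP="192.168.1.10"                  # your hub's address
BACKUP_DIR="$HOME/Hubitat_Backup"      # local staging directory
REMOTE="gdrive:Hubitat_Backup"         # an rclone remote configured for Google Drive

# Build a timestamped filename, e.g. Hubitat-20240102-0345.lzf
backup_name() {
    printf 'Hubitat-%s.lzf' "$(date +%Y%m%d-%H%M)"
}

run_backup() {
    mkdir -p "$BACKUP_DIR"
    file="$BACKUP_DIR/$(backup_name)"
    # Download the latest backup, upload it, then remove the local copy
    wget -q -O "$file" "http://$HUB_IP/hub/backupDB?fileName=latest" &&
        rclone copy "$file" "$REMOTE" &&
        rm -f "$file"
}

# Only run when executed directly, e.g. from a crontab entry such as:
#   45 3 * * * /root/Hubitat_Backup/hubitat_backup.sh
case "$0" in *hubitat_backup.sh) run_backup ;; esac
```

Note that rclone copy leaves the remote file in place, so old backups accumulate on Drive until you prune them yourself.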
Here's my VB6 program to download from all 4 hubs, run daily at 3:45 am:
Option Explicit
Private Declare Function URLDownloadToFile Lib "urlmon" Alias "URLDownloadToFileA" _
(ByVal pCaller As Long, _
ByVal szURL As String, _
ByVal szFileName As String, _
ByVal dwReserved As Long, _
ByVal lpfnCB As Long) As Long
Sub Main()
Dim ret As Long
ret = URLDownloadToFile(0, "http://192.168.1.10/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 0.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.11/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 1.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.12/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 2.lzf", 0, 0)
ret = URLDownloadToFile(0, "http://192.168.1.13/hub/backupDB?fileName=latest", "\\server\backups\Hubitat 3.lzf", 0, 0)
End Sub
@dorian.workman I originally tried to do this on my Mac. Since macOS is Unix-based, all the commands work as long as you install rclone and wget. The issue is that cron has been deprecated on Macs (since OS X 10.4), and launchd feels a bit too complicated for me. If somebody else could help @dorian.workman with launchd, it might be possible.
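For what it's worth, a launchd job is just a small plist file. Here's a sketch, where the label, script path, and schedule are all made-up placeholders you'd change:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Unique job name; placeholder -->
    <key>Label</key>
    <string>com.example.hubitat-backup</string>
    <!-- The backup script to run; placeholder path -->
    <key>ProgramArguments</key>
    <array>
        <string>/Users/you/Hubitat_Backup/hubitat_backup.sh</string>
    </array>
    <!-- Run daily at 3:45 AM -->
    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>3</integer>
        <key>Minute</key>
        <integer>45</integer>
    </dict>
</dict>
</plist>
```

Save it under ~/Library/LaunchAgents/ (named after its Label, e.g. com.example.hubitat-backup.plist) and load it with launchctl load followed by that path.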
This is awesome, thanks for sharing this.
I don't need daily backups, but I just set this up for weekly backups on Mondays, ran a test just now, and it worked great!
I actually took it one step further and wrote a fully self-contained AppleScript to download the file and move it. Script below for anyone who may find it useful. Just save it as an application and trigger it from a recurring calendar event. Easy.
on run
	tell application "Safari"
		tell window 1
			set current tab to (make new tab with properties {URL:"http://HUB IP/hub/backupDB?fileName=latest"})
			delay 5
			close current tab
		end tell
	end tell
	tell application "Finder"
		set sourceFolder to alias "Macintosh SSD:Users:dworkman:Downloads:"
		set targetFolder to alias "Macintosh SSD:Users:dworkman:Documents:Hubitat:"
		set latestFile to last item of (sort (files in sourceFolder) by modification date)
		move latestFile to targetFolder
		set movedFile to name of last item of (sort (files in targetFolder) by modification date)
		display dialog "Hubitat backup is done: " & movedFile
	end tell
end run