How to create automatic backups of hub database

Yeah... what's excessive? Any metrics to follow? The only thing I can think of is to check how many entries are in the past logs.

If you keep your old backups, I would check back through older backups to see if they grew gradually or suddenly.

They are all about the same size... I have it pull the backup daily, and they all hover around 40-42 MB.

OK, well if you look in past logs you can see if there are any useless entries... I tend to keep things clean, so I disable logging on non-crucial things if they run well (after a week or so of testing). And if you have a lot of sensors there will always be a lot of data, of course; no doubt about it, and no reason to make that any less.

Yeah, it's just this one hub, and the download takes longer over time. If I reboot the hub it downloads quickly. If I navigate to the settings/download page I can download it quickly there too; it's just that this URL becomes slower over time.

I don't think so, because all hubs are hooked up to the same switch, and again, if I reboot the hub it downloads much quicker.

Do you speak gently with encouraging words to your hub, or do you get cranky and swear? It can make a difference! :wink:

+1 on the request to include an FTP server on the hub to make backing up easier

Backing up can't get much easier. FTP is just a protocol for the transfer; HTTP, the protocol in use today, is quite adequate.

The method to back up the DB is well documented around here, one for each OS type... Linux, Windows, OSX. Recipes for all of them are available in this thread and others. :slight_smile:

-1 from me for FTP... instead +1 for fixing the dang slow down/crash/DB corruption problem. :smiley:

I did this pretty darn easily with a scheduled task.

Download WGET for windows:
https://eternallybored.org/misc/wget/
Get the EXE (32 or 64) for the latest version.
Put the EXE in C:\wget\

Created a scheduled task in Task Scheduler with this action:
C:\wget\wget.exe --content-disposition -P "C:\hubitat backups" http://[YourHubIP]/hub/backupDB?fileName=latest

Set the time for whenever you want it to run daily. Done. No scripting, no magic, one command.
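
If you'd rather script the task creation instead of clicking through the Task Scheduler GUI, something like this should work from an elevated PowerShell prompt (the task name and run time here are just placeholders, not anything the hub requires):

# Register a daily task that runs the same wget command as above.
$action  = New-ScheduledTaskAction -Execute 'C:\wget\wget.exe' -Argument '--content-disposition -P "C:\hubitat backups" http://[YourHubIP]/hub/backupDB?fileName=latest'
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Hubitat Backup' -Action $action -Trigger $trigger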

That's a great solution.
Thanks for sharing that in such a clear way. I'll be implementing this ASAP.
Cheers,
Nic

@gparra I've been using your script for a while now, and it was working great up until a couple of weeks ago. For some reason, the retention policy is no longer working. Or, rather, it's working oddly. I had the retention policy set to 15. If there are 15 files in the folder, the new backup is not being saved. If I delete one, then the new backup does get saved, but no older ones are deleted. Hope that made sense. I have no experience with PowerShell scripts so I'm hoping someone can point me in the right direction.
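
One thing worth checking: if a script counts the files before saving, a full folder will block the new download instead of rotating out the oldest file, which matches the symptom you describe. The prune step should run after the new backup lands. A minimal PowerShell sketch of that order, assuming the backups sit in "C:\hubitat backups" with an .lzf extension (both placeholders for whatever your script actually uses):

# Run this AFTER the new backup has been saved:
# sort newest-first, keep the 15 most recent, delete the rest.
Get-ChildItem -Path 'C:\hubitat backups' -Filter *.lzf |
    Sort-Object LastWriteTime -Descending |
    Select-Object -Skip 15 |
    Remove-Item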

I put together my own Windows .BAT batch script to grab the latest backup and delete backups older than 90 days. I use Task Scheduler to run this once per week. It uses the wget tool referenced above.

@echo off
cls
echo ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
echo   Grabbing backup from Hubitat...
echo ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
"D:\path\to\wget\wget.exe" --content-disposition -P "D:\path\to\save\folder" http://[hub IP    address]/hub/backupDB?fileName=latest
echo Done.
echo.
echo ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
echo   Deleting older backups...
echo ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
echo Deleting backups older than 90 days...
ForFiles /p "D:\path\to\save\folder" /M *.lzf /d -90 /c "cmd /c del /q @file" 2>&1 | find /v /i "ERROR: No files found with the specified search criteria."
echo Done.
echo.
timeout 5

I know the thread is old, but in case someone needs it:
There is an easy way to get the backups that the system has already created.
Here is a small bash script to fetch the latest one from Hubitat: Download latest Hubitat backup using API · GitHub
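
For anyone on Windows without bash, here's a rough PowerShell equivalent using the /hub/backupDB?fileName=latest endpoint shown earlier in this thread (the hub IP and save path are placeholders):

# Fetch the latest backup over HTTP and save it with a dated name.
$hub  = '[YourHubIP]'   # placeholder: your hub's IP address
$dest = "C:\hubitat backups\hubitat_$(Get-Date -Format yyyy-MM-dd).lzf"
Invoke-WebRequest -Uri "http://$hub/hub/backupDB?fileName=latest" -OutFile $dest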