How to create automatic backups of the hub database

Here is another Linux variant for getting the DB using curl and wget.

curl -c cookie.txt -d username=USERNAME -d password=PASSWORD http://HUB-IP/login
wget --load-cookies=cookie.txt --content-disposition -P /YOUR_HOME/hubitat-backups/ "http://HUB-IP/hub/backupDB?fileName=latest"

My cron looks like this.
The first line runs at 4 AM: it changes to my home directory, then retrieves the cookie, and finally downloads the backup DB.

00 4 * * * cd; curl -c cookie.txt -d username=USERNAME -d password=PASSWORD http://HUB-IP/login; wget --load-cookies=cookie.txt --content-disposition -P /home/USERNAME/Dropbox/hubitat-backups/ "http://HUB-IP/hub/backupDB?fileName=latest"

This line deletes backups older than 5 days.

@daily find /home/USERNAME/Dropbox/hubitat-backups/ -mindepth 1 -type f -mtime +5 -delete


Could this be done in Python or Node-RED? I have both running in Docker containers and would love to get regular backups working.

Check out https://curl.trillworks.com/ to convert the curl command above into Python requests code.
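In case it helps, here is a rough Python sketch of the same flow using the requests library. It assumes the same /login and /hub/backupDB endpoints as the curl/wget commands above; the hub address, credentials, backup folder, and output file name are all placeholders to adjust.

import pathlib
from datetime import date

import requests

HUB_IP = "192.168.1.10"  # placeholder: your hub's IP
BACKUP_DIR = pathlib.Path("/home/USERNAME/hubitat-backups")  # placeholder folder

session = requests.Session()

# Log in; the session keeps the login cookie for the next request.
session.post(f"http://{HUB_IP}/login",
             data={"username": "USERNAME", "password": "PASSWORD"})

# Download the latest backup as raw bytes.
resp = session.get(f"http://{HUB_IP}/hub/backupDB", params={"fileName": "latest"})
resp.raise_for_status()

# Save it under a dated name (the name here is arbitrary; wget's
# --content-disposition normally uses the hub's own file name instead).
BACKUP_DIR.mkdir(parents=True, exist_ok=True)
(BACKUP_DIR / f"hubitat-backup-{date.today()}.lzf").write_bytes(resp.content)

You could run that from a daily cron entry the same way as the wget version.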

I use Node-RED for my daily backups.

The Get Backup node is an HTTP request node that performs a GET to

http://<hubip>/hub/backupDB?fileName=latest

and returns a binary buffer that is passed to the FileIn node to save to disk.

Can this work with a password? I assume that I could do reboots as well.

I would assume so, but dealing with cookies is beyond my skill set.
@dan.t , @corerootedxb or @btk may possess the knowledge you require. I am but a mere fledgling in the ways of the Node.

See this thread for an awesome setup that @btk did. It does reboots and so much more. Of course, you would also need to include something to deal with passwords for this to work.
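For what it's worth, outside Node-RED the cookie part is not too bad: a requests.Session in Python keeps the login cookie automatically, so one login call authenticates the later ones. A rough sketch, assuming the same /login endpoint as the curl example; the /hub/reboot path shown for the reboot call is an assumption, so verify the endpoint on your own hub before using it.

import requests

HUB_IP = "192.168.1.10"  # placeholder

session = requests.Session()

# One login; the cookie is stored on the session object.
session.post(f"http://{HUB_IP}/login",
             data={"username": "USERNAME", "password": "PASSWORD"})

# Any later call through the same session is authenticated, e.g. the backup:
backup = session.get(f"http://{HUB_IP}/hub/backupDB", params={"fileName": "latest"})
backup.raise_for_status()

# Hypothetical reboot call; the /hub/reboot path is an assumption, so
# confirm it against your hub before uncommenting.
# session.post(f"http://{HUB_IP}/hub/reboot")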

Is anyone having trouble with the backup download taking a long time? I am using the above-mentioned URL:
http://HUB-IP-ADDRESS/hub/backupDB?fileName=latest

I have 3 hubs and use Node-RED to download the latest backups every morning. One of my hubs takes close to 3 minutes to deliver the file to Node-RED, which causes a timeout. My other two hubs take less than 30 seconds. If I reboot this hub, it works faster. This hub does have a larger backup because it is my "coordinator" hub.

From a size comparison:
Trouble Hub Backup Size: 10 MB
Hub 2 that works: 5 MB
Hub 3 that works: 4.4 MB

I am testing the node-red-contrib-http-request-ucg node, which lets me set a timeout value, to see if it will at least work.
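For reference, the equivalent in plain Python is just a longer client-side timeout on the download request; a small sketch, where the 300-second value is an arbitrary guess rather than a recommendation.

import requests

# Give the slow hub plenty of time before giving up; 300 seconds is arbitrary.
resp = requests.get("http://HUB-IP-ADDRESS/hub/backupDB",
                    params={"fileName": "latest"},
                    timeout=300)
resp.raise_for_status()

with open("latest-backup.lzf", "wb") as f:
    f.write(resp.content)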

@patrick any thoughts?

I just backed up 4 hubs in a minute or so...

#1 is 9.0 MB
#2 is 7.7 MB
#3 is 16.0 MB
#4 is 1.2 MB

using the same URL you are.

Wow, those backup file sizes are small.

My DB is usually around 42 MB.

On my 3 hubs they are all 5-15 MB.

Maybe you have something logging excessively for some reason (or for no reason at all) :slight_smile:

@ritchierich: Couldn't this just be a networking issue?

Yeah... what's excessive? Any metrics to follow? The only thing I can think of is to check how many entries there are in Past Logs.

If you keep your old backups, I would check back through the older ones to see whether they grew gradually or suddenly.

They are all about the same size... I have it pull the backup daily. They all hover around 40-42 MB.


OK, well, if you look in Past Logs you can see if there are any useless entries... I tend to keep things clean, so I disable logging on non-crucial things once they run well (after a week of testing or so). And if you have a lot of sensors there will always be a lot of data, of course; no doubt about it, and no reason to make that any less.

Yeah, it's just this one hub, and it gets slower over time. If I reboot the hub, it downloads quickly. If I navigate to the settings/download page, I can download it quickly there too; it's just that URL that becomes slower over time.

I don't think so, because all hubs are hooked up to the same switch, and again, if I reboot the hub it downloads much quicker.

Do you speak gently with encouraging words to your hub, or do you get cranky and swear? It can make a difference. :wink:


+1 on the request to include an FTP server on the hub to make backing up easier

Backing up can't be easier. FTP is just a protocol for the transfer. HTTP, the protocol in use today, is quite adequate.

The method to back up the DB is well documented around here, one for each OS type... Linux, Windows, OSX. Recipes for all of them are available in this thread and others. :slight_smile:

-1 from me for FTP... instead +1 for fixing the dang slowdown/crash/DB corruption problem. :smiley:


I did this pretty darn easily with a scheduled task.

Download WGET for windows:
https://eternallybored.org/misc/wget/
Get the EXE (32-bit or 64-bit) for the latest version.
Put the EXE in C:\wget\

Create a scheduled task in Task Scheduler that runs:
C:\wget\wget.exe --content-disposition -P "C:\hubitat backups" http://[YourHubIP]/hub/backupDB?fileName=latest

Set the time for whenever you want it to run daily. Done. No scripting, no magic, one command.
