URL to Download Latest Maintenance Backup

For the last 5 years I have had a Node-RED flow that downloads the latest backup from my 5 hubs every morning at 3:30 AM, using the following URL:
http://IP-ADDRESS/hub/backupDB?fileName=latest

All has been working well, but I have seen posts about memory leaks associated with backups, and I have noticed my hubs' free memory falls over time, especially around the maintenance window for database cleanup and the automated backup. Some posters have been advised to reduce the backup frequency, which I could probably do.

That said, the URL above creates another backup rather than downloading the "latest" automated backup the hub generates during its nightly maintenance window. For example, my database cleanup is set for 2 AM and Node-RED downloads the backup at 3:30 AM.

@gopher.ny is there a URL to download the latest maintenance/database-cleanup backup instead of creating a new one? Would downloading that backup help with the memory situation folks have been posting about here on the Community?


I believe the memory dip is just from the backup job running; memory normally recovers shortly afterward. My personal approach was to disable automated backups in the system settings and use only the scripted method (mine was PowerShell from a Windows box, but it's the same approach).


Interesting idea. Thanks for the input.

@gopher.ny if we are downloading backups via an external system, would it be best to set the "Local Backup Schedule" to disabled?

That, or set it to a low frequency. Having two backups per day doesn't make much sense.

Agreed, but is there a URL to download the latest automated backup? The URL in the OP always creates a new one rather than downloading the "latest" as the query parameter suggests.


This is the shell script I run on an rpi each morning, 2 hours after the scheduled backup, and it doesn't appear to create a new backup on the HE hub. Don't ask me to remember what each part does. :slight_smile: My guess is it grabs the list of local backups from the web page, finds the most recent one, then downloads it. The bit at the beginning keeps 14 days' worth of backup files on the rpi.

heName='HERULES03'
heAddr='192.168.x.x'
backupdir='/data/backups/hubitat/HERULES03'
cd "$backupdir" || exit 1

# Retention: delete all but the 14 newest files in the backup directory.
ls -1tr | head -n -14 | xargs -d '\n' rm -f --

# Scrape the hub's backup list page for download links, take the most
# recent file name, and download it under a timestamped local name.
curl "http://$heAddr/hub/backup" | grep download | awk -F"=" '{ print $4}' | awk -F'"' '{print $2}' | sed '/^$/d' | tail -1 | xargs -I @ curl "http://$heAddr/hub/backupDB?fileName=@" -o "$backupdir/${heName}_$(date +%Y-%m-%d-%H%M).lzf"

There was a thread a while ago where various alternative scripts were discussed, including some PowerShell ones, I think...


Just following up to close the loop on this thread. @gopher.ny has added an API endpoint that downloads the latest available backup without generating a new one:
HUB-IP-ADDRESS/hub/advanced/latestBackupFile
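For anyone wanting to drop it into a cron job, here is a minimal sketch using that endpoint. The hub address and backup directory are placeholders you would set for your own network, and the 14-file retention just mirrors the earlier rpi script; this is an illustration, not a tested production script.

```shell
#!/bin/sh
# Sketch: pull the hub's most recent maintenance backup without creating
# a new one, via the /hub/advanced/latestBackupFile endpoint.
# HUB_ADDR and BACKUP_DIR are placeholders -- adjust for your setup.
HUB_ADDR='192.168.x.x'
BACKUP_DIR="$HOME/hubitat-backups"

mkdir -p "$BACKUP_DIR"

# Timestamped local file name, e.g. hubitat_2025-01-31-0530.lzf
OUTFILE="$BACKUP_DIR/hubitat_$(date +%Y-%m-%d-%H%M).lzf"

# -f makes curl exit nonzero on HTTP errors so cron can surface failures.
curl -fsS "http://$HUB_ADDR/hub/advanced/latestBackupFile" -o "$OUTFILE"

# Optional retention: keep only the 14 newest files in the backup directory.
ls -1t "$BACKUP_DIR" | tail -n +15 | while read -r f; do
  rm -f -- "$BACKUP_DIR/$f"
done
```

Since this endpoint only reads the backup created during the maintenance window, scheduling the download an hour or two after the cleanup time should always fetch that night's file.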

@bobbyD feel free to close out this thread.


The link you shared is not available to non-beta users, so I replaced it with a screenshot.

