Automatic Hubitat Backup to Synology NAS without a PC

Hi guys,

Does anyone know how to back up a Hubitat hub to a Synology NAS without using a PC?

thanks

If you have access to cron on the NAS, see the instructions in this post for an example (click through to the post itself; the inline summary below is garbled):

https://community.hubitat.com/t/hub-protection-service-coming-soon/45408/29?u=rsjaffe

Thank you so much for the info. I will try it out.

I've been meaning to set this up. I have a DS918+; not sure if the UI is the same on yours.

Open Control Panel, select Task Scheduler, then click Create, then Scheduled Task -> User defined script.

The default schedule is once a day at midnight. Change that if you want.

The actual script text is:

cd /volume1/computer_backups/Hubitat
/usr/bin/wget http://10.0.0.102/hub/backupDB?fileName=latest --backups=7
/bin/cp backupDB\?fileName\=latest hubitat_$(date +\%Y-\%m-\%d_\%T).lzf

Of course, you'll need to specify the proper directory name and the IP address of your hub.
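The escaping in that cp line matters: wget saves the download under the literal file name backupDB?fileName=latest, so the ? has to be quoted or backslash-escaped to stop the shell from treating it as a glob. Here is a minimal, self-contained demo of just the copy/naming step (the temp directory and "fake database" contents are stand-ins; no hub is contacted):

```shell
#!/bin/sh
# Demo of the copy/naming step only -- no hub access.
tmp=$(mktemp -d)
cd "$tmp" || exit 1

# Stand-in for the file wget would have downloaded; note the literal
# name, including the '?'.
echo "fake database" > 'backupDB?fileName=latest'

# Dated snapshot, same pattern as the script. (The backslashes before
# the %'s in the original are only required when the line sits directly
# in a crontab, where % is a special character; in a script file a plain
# % works.)
stamp=$(date +%Y-%m-%d)
cp 'backupDB?fileName=latest' "hubitat_${stamp}.lzf"

ls hubitat_*.lzf
```

The single quotes around the source name do the same job as the backslashes in the original post.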

Also, this goes without saying, but your NAS will need network access to your HE. Normally I have all my automation and IoT stuff on a separate subnet and Wi-Fi AP, so rogue devices can't access my computers or NAS.

For this to work, though, I have to have the HE on the same subnet. The NAS has a firewall, so I might try playing with that to protect it from an HE that went over to the dark side. Probably unlikely, but I don't like taking chances.

Hope this helps.

Edit: One more thing - if you haven't already, go into your switch or router and set a DHCP reservation for your HE hub, so it effectively has a static IP address.


From what I've read, cron on a Synology NAS is quirky. Fortunately, there's a Task Scheduler UI that lets you add user-defined tasks. Probably uses cron under the hood.

Hi,

Thanks a lot for the detailed instructions. I'll try your method and let you know.

You can also use Node-RED to do it, leveraging the Docker container app. I know Synology has it, but I run a QNAP so I'm not sure of its name there. That's what I use to download and store the files, along with a bunch of other things like Grafana charts of temperatures, etc.

I tried to run it but nothing happens, and no email was received either. I wonder what I did wrong.


For QNAP, I created the following file on Windows and FTP'd it to the QNAP,
then ran chmod 755 on it,
then dos2unix on it,
then tested it, and since it worked (it did),
I put it in crontab this way:

echo "0 6 * * * /share/qnapshared/hubitat/getbackup.sh" >> /etc/config/crontab

Here is the file:

[/share/qnapshared/hubitat] # cat getbackup.sh
#!/bin/sh

cd /share/qnapshared/hubitat
/usr/bin/wget http://192.168.11.109/hub/backupDB?fileName=latest --backups=7
/bin/cp backupDB?fileName=latest hubitat_$(date +%Y%m%d).lzf

[/share/qnapshared/hubitat] #
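One caveat worth adding, hedged because QTS versions differ: on many QNAP firmwares, appending a line to /etc/config/crontab is not picked up until cron is reloaded. A sketch of the append step is below; it writes to a temp file so it can run anywhere, while the two reload commands commonly cited for QNAP are shown as comments (verify them against your QTS version before relying on them):

```shell
#!/bin/sh
# CRONTAB_FILE is a temp stand-in here; on a real QNAP it would be
# /etc/config/crontab.
CRONTAB_FILE=$(mktemp)

# Same append as in the post above.
echo "0 6 * * * /share/qnapshared/hubitat/getbackup.sh" >> "$CRONTAB_FILE"

# On the NAS itself, these follow-up commands are commonly cited so the
# new entry actually takes effect:
#   crontab /etc/config/crontab
#   /etc/init.d/crond.sh restart

# Confirm the entry landed.
grep -c getbackup.sh "$CRONTAB_FILE"
```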


Great post, @dcaton1220

Coincidentally, I set this up earlier today from @rsjaffe's post from months ago, linked above.

It works great, but just to forewarn others, you will get some odd file names from the daily backup.

I set up two tasks in Synology - the daily backup, and the weekly copy so I could control them separately. As noted in the other thread:

That [the daily task] backs up the database daily and keeps 7 days' worth of backups. Weekly, [the weekly task] copies the latest backup to a file named with the date, and keeps those weekly backups indefinitely.

My first daily backup file name was just the raw download name, backupDB?fileName=latest. But the weekly copy gets a proper dated name, like hubitat_YYYY-MM-DD.lzf.


It's working now, thanks.

There are two ways you can debug. Go into Control Panel, click Task Scheduler, and highlight the task.

First method: click Action, then View Result. The output of the script is in the "Standard output/error" box. Unfortunately, the dialog isn't resizable.

Second method: highlight the task and click Settings.

Check "Save output results" and specify a folder. A folder named "synoscheduler" will be created in the specified folder. Drill down a few folders and you'll find output.log and script.log files for each run of the task. These should show you the commands that are being executed and whatever is going wrong. You can open the .log files in any text editor, but you may have to copy them to your local machine first.

You can also run the script commands interactively if you log into your NAS using SSH. PuTTY is a good tool for this. If you do, make sure you log in as root so you are running your commands in the same context as the Task Scheduler does (assuming you left the User field set to root on the task's General tab). If you log into the NAS using your normal credentials, you can gain root access by typing su in the command window and then entering the root password.
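For example, an interactive debugging session along those lines might look like the transcript below. The hostname, directory, and hub IP are the thread's examples, not fixed values; also note that on newer DSM versions direct root SSH login is disabled, so sudo -i is the usual substitute for su:

```
ssh your-nas-user@your-nas-ip
sudo -i                  # or: su, then the root password, as described above
cd /volume1/computer_backups/Hubitat
/usr/bin/wget 'http://10.0.0.102/hub/backupDB?fileName=latest' --backups=7
ls -l                    # confirm the backup file appeared
```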


I got it working after I changed the user to "root".
Thank you so much for your help


Excellent! Thanks for this @dcaton1220

For anyone having issues, please check that the folder path exists with the proper permissions on volume1. I am running it as admin with success.


If I have configured security on the hub so that I have to log in to access it, what needs to be updated in this script to authenticate to the hub before requesting the backup file?
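Not a definitive answer, but the pattern commonly shared on the Hubitat forums for a hub with login security enabled is to log in first and reuse the session cookie for the download. Below is a sketch under those assumptions; the /login endpoint and the username/password form field names are what community scripts tend to use, and all the values are placeholders, so verify everything against your own hub before relying on it:

```shell
#!/bin/sh
# Hypothetical values -- replace with your own.
HUB_IP="10.0.0.102"
HE_USER="your_hub_login"
HE_PASS="your_hub_password"
COOKIE_FILE=$(mktemp)

# Step 1: log in and store the session cookie in a cookie jar.
curl -s --max-time 10 -c "$COOKIE_FILE" \
     -d "username=$HE_USER" -d "password=$HE_PASS" \
     "http://$HUB_IP/login"

# Step 2: request the backup with that cookie instead of anonymously.
curl -s --max-time 120 -b "$COOKIE_FILE" \
     -o "hubitat_$(date +%Y-%m-%d).lzf" \
     "http://$HUB_IP/hub/backupDB?fileName=latest"
```

curl is used here instead of wget because its cookie-jar options (-c to save, -b to send) make the two-step flow simple.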