Node-RED for daily Hub backups to an external storage device

This is the command I'm running to back up the .node-red directory locally; you could then set up a cron job on the NAS to pick up the file?

sudo zip -rq /PATH_TO_BACKUPS/node-red-pi4-backup-$(date +%Y-%m-%d).zip /PATH_TO_NODE_RED/.node-red

I'm then using Resilio Sync to copy the backup to my PC.

It keeps the last 7 days.
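The 7-day retention comes from the flow's "delete older than 7 days" exec node, which runs `find ... -mtime +7 -type f -delete`. A minimal, self-contained sketch of that mechanism, using a hypothetical /tmp directory and dummy file names so it's safe to try anywhere:

```shell
#!/bin/sh
# Sketch of the retention step. DIR is a throwaway demo path, not a real
# backup location -- point it at your backup folder in a real setup.
DIR=/tmp/backup-retention-demo
mkdir -p "$DIR"
touch -d "10 days ago" "$DIR/old-backup.zip"   # simulate a stale backup (GNU touch)
touch "$DIR/fresh-backup.zip"                   # simulate today's backup
# delete any .zip more than 7 days old, as the flow's exec node does
find "$DIR" -name '*.zip' -mtime +7 -type f -delete
ls "$DIR"   # only fresh-backup.zip should remain
```

`-mtime +7` matches files whose modification time is more than 7 days old, so each nightly run prunes anything beyond the rolling week.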

Full flow: this will create the backup on the Pi, and then you just need to sort out something to move it. It's set to run at 22:00 each day.

[{"id":"2d1a5950.d2c6e6","type":"tab","label":"Flow 1","disabled":false,"info":""},{"id":"aae1832f.83332","type":"exec","z":"2d1a5950.d2c6e6","command":"sudo zip -rq /**PATH TO BACKUPS**/node-red-pi4-backup-$(date +%Y-%m-%d).zip /**PATH TO NODE RED**/.node-red","addpay":false,"append":"","useSpawn":"false","timer":"","oldrc":false,"name":"backup .node-red","x":410,"y":160,"wires":[["e639f00f.06ee1"],["e639f00f.06ee1"],["70c0e369.8b18dc"]]},{"id":"2381e532.f66a0a","type":"inject","z":"2d1a5950.d2c6e6","name":"daily @2200","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"00 22 * * *","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":180,"y":160,"wires":[["aae1832f.83332"]]},{"id":"e73c88dc.d21868","type":"debug","z":"2d1a5950.d2c6e6","name":"backup-log","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":990,"y":240,"wires":[]},{"id":"70c0e369.8b18dc","type":"switch","z":"2d1a5950.d2c6e6","name":"check op success","property":"payload.code","propertyType":"msg","rules":[{"t":"eq","v":"0","vt":"num"},{"t":"else"}],"checkall":"true","repair":false,"outputs":2,"x":230,"y":300,"wires":[["a272f3c.abf7a1"],["99b2ffc7.106ac"]]},{"id":"a272f3c.abf7a1","type":"exec","z":"2d1a5950.d2c6e6","command":"sudo find /**PATH TO BACKUPS**/*.zip -mtime +7 -type f -delete","addpay":false,"append":"","useSpawn":"false","timer":"","oldrc":false,"name":"delete older than 7 days","x":470,"y":260,"wires":[["e639f00f.06ee1"],["e639f00f.06ee1"],["14068310.122dad"]]},{"id":"99b2ffc7.106ac","type":"change","z":"2d1a5950.d2c6e6","name":"custom error msg","rules":[{"t":"set","p":"payload","pt":"msg","to":"❌ There was an error in Pi4 backup flow. Check logs.","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":570,"y":380,"wires":[["e73c88dc.d21868","7ee45bd3.4948e4"]]},{"id":"14068310.122dad","type":"switch","z":"2d1a5950.d2c6e6","name":"check op success","property":"payload.code","propertyType":"msg","rules":[{"t":"eq","v":"0","vt":"num"},{"t":"else"}],"checkall":"true","repair":false,"outputs":2,"x":490,"y":320,"wires":[["71b9f688.7a0718"],["99b2ffc7.106ac"]]},{"id":"71b9f688.7a0718","type":"template","z":"2d1a5950.d2c6e6","name":"success msg","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"✅ RPi4 Backup complete.","output":"str","x":750,"y":240,"wires":[["e73c88dc.d21868","7ee45bd3.4948e4"]]},{"id":"e639f00f.06ee1","type":"switch","z":"2d1a5950.d2c6e6","name":"filter null","property":"payload","propertyType":"msg","rules":[{"t":"nnull"},{"t":"else"}],"checkall":"true","repair":false,"outputs":2,"x":660,"y":160,"wires":[["e73c88dc.d21868"],[]]},{"id":"7ee45bd3.4948e4","type":"pushover api","z":"2d1a5950.d2c6e6","keys":"54ad51a1.76db6","title":"","name":"","x":960,"y":380,"wires":[]},{"id":"54ad51a1.76db6","type":"pushover-keys","z":"","name":"Pushover"}]

Ah ok, so I read that completely wrong, but I'll leave the reply for when you want to back up NR too :slight_smile:

I have more than one hub and set up a cron job on my NAS.

Running this will generate a backup immediately, pulled from the hub's IP and saved to the storage path.

wget --content-disposition -P /PATH_TO_NAS_STORAGE/ http://YOUR_HUB_IP/hub/backupDB?fileName=latest

I have a QNAP, so I echo the job out to the required crontab file.

echo "00 10 * * * wget --content-disposition -P /PATH_TO_NAS_STORAGE/ http://YOUR_HUB_IP/hub/backupDB?fileName=latest" >> /etc/config/crontab

Alter time accordingly (obviously) :slight_smile:
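One caveat worth checking: on QNAP firmware, edits to /etc/config/crontab are not, as far as I know, picked up automatically, so you usually also need to reload the file and restart cron. Treat this as an assumption to verify on your model (it's a system-specific config step, not something testable here):

```shell
# QNAP-specific: load the edited crontab file and restart the cron daemon
crontab /etc/config/crontab
/etc/init.d/crond.sh restart
```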


Thank you, but actually I was thinking about using Node-Red to automate the backup of the Hubitat hub I have. Sorry that wasn't clear.

I have Node-RED running on an Unraid server that already does daily backups of it.

I guess I could just create a user script on Unraid to do the backup.


Yeah, I read it all wrong lol. But my second post could help achieve that.
Glad I read this post though, as you've reminded me to set up my C7 in the backups :+1:


Actually it did. I already had Unraid and its User Scripts plugin loaded. I just needed to take the wget command and create a quick script from it. Thanks for that. It is already up and working.
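For anyone else doing the same, here's a sketch of what such a User Scripts entry might look like. The hub IP, backup share path, and the .lzf extension for Hubitat backup files are all assumptions to adjust for your own setup:

```shell
#!/bin/bash
# Hypothetical Unraid "User Scripts" entry: pull the latest Hubitat backup
# and prune old copies. HUB_IP and BACKUP_DIR are placeholders.
HUB_IP="${HUB_IP:-192.168.1.100}"                       # assumption: your hub's IP
BACKUP_DIR="${BACKUP_DIR:-/mnt/user/backups/hubitat}"   # assumption: your share path
mkdir -p "$BACKUP_DIR"
# --content-disposition keeps the date-stamped filename the hub supplies
wget -q --tries=1 --timeout=10 --content-disposition \
  -P "$BACKUP_DIR" "http://${HUB_IP}/hub/backupDB?fileName=latest" \
  || echo "backup download failed"
# prune backups older than 7 days (assumption: hub backups download as .lzf)
find "$BACKUP_DIR" -name '*.lzf' -type f -mtime +7 -delete
```

Schedule it daily in the User Scripts UI, the same as the cron approach above.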


Here is a link to the solution that I posted a while back. I've tweaked it some, but all the basic logic is described there. Let me know if you have any questions.

This will do it too.

[{"id":"ede3408f.c2d5","type":"inject","z":"8d7c327.e4a85d","name":"4:32am Daily","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"32 04 * * *","once":false,"onceDelay":0.1,"topic":"time","payload":"","payloadType":"date","x":480,"y":140,"wires":[["dd6c1136.1bec3"]]},{"id":"dd6c1136.1bec3","type":"http request","z":"8d7c327.e4a85d","name":"Download Backup","method":"GET","ret":"bin","paytoqs":"ignore","url":"http://192.168.1.100/hub/backupDB?fileName=latest","tls":"","persist":false,"proxy":"","authType":"","x":710,"y":140,"wires":[["7b8f5373.9df03c"]]},{"id":"9180c942.8d8048","type":"file","z":"8d7c327.e4a85d","name":"Backup File","filename":"","appendNewline":false,"createDir":true,"overwriteFile":"false","encoding":"none","x":1130,"y":140,"wires":[[]]},{"id":"212e82dc.63cb6e","type":"comment","z":"8d7c327.e4a85d","name":"Coordinator Hub","info":"","x":280,"y":140,"wires":[]},{"id":"cc2e4ae6.820538","type":"comment","z":"8d7c327.e4a85d","name":"DO NOT run these flows more than once per day, it will append the second file to the first, making it useless","info":"","x":700,"y":180,"wires":[]},{"id":"7b8f5373.9df03c","type":"string","z":"8d7c327.e4a85d","name":"","methods":[{"name":"strip","params":[{"type":"str","value":"attachment; filename="}]},{"name":"prepend","params":[{"type":"str","value":"/mnt/hubitat/coordinator/"}]}],"prop":"headers.content-disposition","propout":"filename","object":"msg","objectout":"msg","x":910,"y":140,"wires":[["9180c942.8d8048"]]}]

Change the time in the first node.
Change the IP address of HE in the second node.
Change the mount point in the 3rd node. My mount point links to my Synology NAS.

Thanks for that. I have imported that into my Node-RED so I can take a closer look at it later. I think the option to just run an automated job on my Unraid server is fine for now though.

Thanks! This helped me out with trying to figure out a solution for backing up my node-red install. A couple of quick questions:

  1. Have you tried to restore from the backup on a different machine and does it work?
  2. On a Mac, I had to remove the "sudo" in the exec node as it would error out (sudo prompts for password). I'm assuming that is not the case on a Pi (I installed Node-RED on the Homebridge pre-configured image in case that matters)?

I've yet to test the restore part, but I don't foresee it being an issue, as I have the whole directory.
Sadly, I've no idea about Macs, but I suspect that as I'm running it locally it's not an issue.


It worked perfectly (without "sudo") on a Mac (my test platform) but failed on the Pi (my "live" platform) :frowning_face:

With or without "sudo" in the command, I get the following - Command failed: zip -rq /home/pi/Documents/Nodered_Backup/2021-01-13-Nodered_Backup.zip /home/pi/.node-red/bin/bash: zip: command not found

Any idea what could be causing this? The payload that I am passing to the Exec node is sudo zip -rq /home/pi/Documents/Nodered_Backup/2021-01-13-Nodered_Backup.zip /home/pi/.node-red

It's probably not in your PATH, so try running "/usr/bin/zip"?
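Before hunting for paths, it may be worth checking whether zip is installed at all. A quick sketch (the install line assumes a Debian/Raspbian base):

```shell
# print zip's location if the shell can find it, otherwise report it missing
command -v zip || echo "zip not found"
# if it's missing on Raspbian/Debian, install it with:
# sudo apt-get install -y zip
```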


Unfortunately, that did not work either. I've posted on the Homebridge Discourse channel, so maybe I will get a response there.


You may just need to use a different compression tool, or try loading gzip onto your Mac.


Mmm. I bet things are sandboxed / running in a chroot.


Are you having the problem on the Mac or the Raspberry Pi? If a Pi, what OS are you running? You can probably just modify the command to use tar instead.
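Since tar and gzip ship with Raspbian by default, a tar-based version of the backup command should work without installing anything. A sketch using the paths from the earlier posts (they're assumptions; change them for your setup):

```shell
#!/bin/sh
# tar/gzip equivalent of the zip backup command. SRC_DIR and OUT_DIR are
# taken from the paths quoted earlier in the thread -- adjust as needed.
SRC_DIR="/home/pi"
OUT_DIR="/home/pi/Documents/Nodered_Backup"
if [ -d "$SRC_DIR/.node-red" ]; then
    mkdir -p "$OUT_DIR"
    # -C changes into SRC_DIR first, so the archive stores a relative
    # ".node-red" path instead of the full absolute path
    tar -czf "$OUT_DIR/$(date +%Y-%m-%d)-Nodered_Backup.tar.gz" -C "$SRC_DIR" .node-red
else
    echo "no .node-red directory found at $SRC_DIR"
fi
```

Restoring is then `tar -xzf <archive> -C /home/pi`.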


It's on the Pi (Mac works without "sudo"). It's the pre-configured Homebridge Raspbian image:
PRETTY_NAME="Raspbian GNU/Linux 10 (buster)" NAME="Raspbian GNU/Linux" VERSION_ID="10" VERSION="10 (buster)" VERSION_CODENAME=buster ID=raspbian ID_LIKE=debian HOME_URL="http://www.raspbian.org/" SUPPORT_URL="http://www.raspbian.org/RaspbianForums" BUG_REPORT_URL="http://www.raspbian.org/RaspbianBugs"

Yeah - that's what I'm thinking. There is a way to give the Homebridge user "sudo" privileges, so I may try that.

Also, a lot of images like that only include what they need for the prebuilt package they're providing. If that's the case, they may have just pulled that tool from the image. MotionEYE is like that as well. Can you load Homebridge over Raspberry Pi OS, or do you have to use the prebuilt image?

Try tar on the command line and see if that works at all for you.


That was it - I installed zip and it is working. Thanks for your help.

Just out of curiosity, the backup on my Mac (dev) instance was > 160 MB and the Pi's is 17 MB. Is the size of your backup on the Pi around what mine is? I do have extra nodes installed that I was playing around with on the Mac, but I was wondering if the backup on the Pi missed some stuff.

I am in a bit of a different setup. Node-RED runs in a Docker container on a home server running Unraid, and there is an Unraid plugin that backs up Docker apps on a schedule. That is what is doing my backup. This all started with me wanting to get the backup of the Hubitat C7 hub off that device and onto the Unraid server, which I got with the information that Royski provided earlier. So my backup of the Node-RED Docker container wouldn't be comparable either.

That said, going from 160 MB to 17 MB is dramatic. Maybe it's a difference in the compression algorithm between the systems, or the base install is much more limited on a Pi since it's an ARM CPU.