[SOLVED] Downloading Latest Backup File

To fix this, change the $5 to a $4, as in:
curl -k -sb $cookiefile https://$he_ipaddr/hub/backup | grep download | awk -F"=" '{ print $4}' | awk -F'"' '{print $2}' | sed '/^$/d' | tail -1 | xargs -I @ curl -k -sb $cookiefile https://$he_ipaddr/hub//backupDB?fileName=@ -o $backupdir/MainHubitat_$(date +%Y-%m-%d-%H%M).lzf

although I'd simplify the middle and just use

curl -k -sb $cookiefile http://$he_ipaddr/hub/backup | awk -F'["=]' '/class=.download/ { file=$9 } END { print file }' | xargs -I@ curl -k -sb $cookiefile https://$he_ipaddr/hub//backupDB?fileName=@ -o $backupdir/MainHubitat_$(date +%Y-%m-%d-%H%M).lzf
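For anyone puzzling over that one-liner, here's the awk step broken out with comments. The sample HTML row below is a guess at what the /hub/backup page returns (the real markup may differ), but it splits into the same fields that both commands above rely on:

```shell
# Hypothetical markup for one row of the /hub/backup download table:
sample='<td><a class="download" href="#" data-fileName="2021-03-28~2.2.5.131.lzf">Download</a></td>'

# Split on both double quotes and '=' so the quoted file name lands in a
# predictable field ($9 here), keep only rows containing class="download",
# and print the last match, i.e. the newest backup.
echo "$sample" | awk -F'["=]' '/class=.download/ { file=$9 } END { print file }'
# prints: 2021-03-28~2.2.5.131.lzf
```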

2 Likes

Mine (happily stolen from someone else smarter than me, but can't remember who) is still working on my Synology NAS:

On Synology NAS scheduler:

Task 1, Daily: Copy backup file from Hub to NAS:

cd "/path where you want to store backups"; /usr/bin/wget "http://<your-hub-ip>/hub/backupDB?fileName=latest" --backups=14 --quiet
(The hub address was eaten by the forum formatting in the original post; substitute your own hub's IP.)

Set "--backups=14" to the number of backups you want to keep before removing older ones. I save two weeks' worth before starting to drop off older ones.

Task 2, Weekly: Makes a dated copy of the current backup once a week. These copies are not automatically removed; they are kept until I manually delete them.

cd "/path where you are storing backups"; /bin/cp backupDB?fileName=latest hubitat_$(date +%Y%m%d).lzf
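For reference, here's a sketch of the two scheduler tasks combined into one script. The hub address and directory are placeholders, and the extra wget options just keep the sketch from hanging; the actual commands are the same as above:

```shell
#!/bin/sh
# Combined sketch of the two Synology scheduler tasks.
hub_ip="192.168.1.50"                   # hypothetical hub address
backupdir="${backupdir:-$(mktemp -d)}"  # stand-in for your backup share
cd "$backupdir" || exit 1

# Task 1 (daily): fetch the backup. GNU wget's --backups=14 rotates any
# existing copy to .1, .2, ... before writing, keeping at most 14 old ones.
wget "http://$hub_ip/hub/backupDB?fileName=latest" --backups=14 --quiet \
    --timeout=5 --tries=1 || echo "wget failed (no hub reachable from here)"

# Task 2 (weekly): keep a dated copy that the rotation never touches.
if [ -f "backupDB?fileName=latest" ]; then
    cp "backupDB?fileName=latest" "hubitat_$(date +%Y%m%d).lzf"
fi
```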

2 Likes

Thanks @aaiyar.

Would I be right in thinking that the difference with your script is that it downloads the latest backup file already captured automatically by the HE hub, whereas the other methods listed ask the HE hub to perform a fresh backup at the time the HTTP calls are made and then download that file?

Simon

It’s actually the other way about. The other one is better.

Hmmm, I thought latest was simply downloading the file. Are you saying that part is correct, but the other is the preferred method?

No, I’m saying that my method creates a new backup. The other method downloads the last one (which I think is better).

Ah, then yes, I agree: the download is what I am after, though I always remembered it being a pain to get the syntax right. That makes sense now, thanks.

1 Like

It does look like I was only saving the web page and not the backup. I adjusted my script to use @jlv 's version, and it now matches the size of the latest download already captured on the HE hub. Thanks. @jlv - you do appear to be missing a space between the I and the @ symbol in your second example.

If anyone can work their magic (and if it is possible) to capture the time the backup was taken on the hub in the stored file name, rather than the time it was downloaded, that would be awesome.

Simon

3 Likes

This comes close....


Hubitat/Hubitat-Configuration-Backup.ps1 at master · BlueBlock/Hubitat · GitHub

1 Like

I think if you modified your script to pass in the file name instead of "latest", it would download the last backup. The url appears to be:

http://YOUR_HUB_IP/hub/backupDB?fileName=YYYY-MM-DD~FW_VERSION.lzf (eg: http://192.168.1.ZZZ/hub/backupDB?fileName=2021-03-28~2.2.5.131.lzf)

The date in YYYY-MM-DD format would be determined by your backup schedule. I still haven't figured out the time stamp (that's only on the web page, not part of the file name).
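In other words, once you know the backup date and the firmware version, the file name and URL can be assembled like this (the date, firmware version, and IP are all illustrative):

```shell
# Build the expected backup file name from a date and firmware version.
backup_date="2021-03-28"     # set by your hub's backup schedule
fw_version="2.2.5.131"       # current hub firmware
file_name="${backup_date}~${fw_version}.lzf"
echo "$file_name"
# prints: 2021-03-28~2.2.5.131.lzf

# Hypothetical hub IP; this is the URL to fetch the already-made backup.
url="http://192.168.1.50/hub/backupDB?fileName=${file_name}"
echo "$url"
```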

EDIT: I think @jlv suggested the same thing!

1 Like

It looks like this script is also creating a new backup (rather than getting the latest), since it uses the same URL as @aaiyar's script ($url = "$($urlhub)/hub/backupDB?fileName=latest").

I noticed that as well, and (without testing) I suspect the get-date may return the current date at the time the script is being run.

It does, but I think it's being used as part of the filename for the downloaded file. I could be wrong though, as I don't speak PowerShell!

Yes, the thing I was after was being able to not only download the last backup file, but also get the date/time it was produced by the HE hub, not when I downloaded it.

The space isn't necessary. xargs allows its arguments to cuddle.
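A quick demonstration that the two spellings behave identically:

```shell
# xargs treats -I@ and -I @ the same: the replace string may be
# "cuddled" against the flag or given as a separate argument.
echo latest.lzf | xargs -I@ echo "file: @"
echo latest.lzf | xargs -I @ echo "file: @"
# both print: file: latest.lzf
```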

1 Like

I've gotten this to work in Node-RED. Since I don't know curl or shell scripting, maybe someone who understands both can translate this for people who don't use Node-RED?
(@aaiyar @jlv @Vettester?)

The flow uses the node-red-contrib-moment and node-red-contrib-credentials nodes in addition to the standard nodes. It also uses the Hub Information driver from @thebearmay to get the hub name, firmware version, local IP, etc.

NOTE: This flow downloads the current day's backup and should be scheduled AFTER the time HE does its internal backup. It will save the file into the directory specified with the date and time as part of the file name:

Node-RED flow:

[{"id":"f527fe93.c5a7b","type":"subflow","name":"Hubitat Login","info":"","category":"","in":[{"x":40,"y":120,"wires":[{"id":"a9ea0344.3e1b6"}]}],"out":[{"x":1336,"y":122,"wires":[{"id":"fa6d0c33.1064b","port":0}]}],"env":[],"color":"#DDAA99"},{"id":"fa6d0c33.1064b","type":"function","z":"f527fe93.c5a7b","name":"SetCookies","func":"var cookies = msg.responseCookies;\n//Set session cookies for the response header\nmsg.headers = {\n    Cookie : \"HUBSESSION=\" + cookies.HUBSESSION.value};\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":1190,"y":122,"wires":[[]]},{"id":"70a7ba18.feebfc","type":"http request","z":"f527fe93.c5a7b","name":"Login to Hubitat","method":"POST","ret":"txt","paytoqs":"ignore","url":"http://192.168.1.113/login","tls":"","persist":false,"proxy":"","authType":"","x":974,"y":122,"wires":[["fa6d0c33.1064b"]]},{"id":"119114df.454933","type":"function","z":"f527fe93.c5a7b","name":"SetRequestHeaders","func":"var cookies = msg.responseCookies;\nmsg.headers = {\n    Cookie : \"HUBSESSION=\" + cookies.HUBSESSION.value,\n    'content-type':'application/x-www-form-urlencoded'};\n\n//Hubitat login username and password for form data\nvar data = 'username='+msg.username+'&password='+msg.password+'&submit=Login';\nmsg.payload = data;\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":733,"y":122,"wires":[["70a7ba18.feebfc"]]},{"id":"d9c504c8.4ccdd8","type":"http request","z":"f527fe93.c5a7b","name":"Get Cookie","method":"GET","ret":"txt","paytoqs":"ignore","url":"http://192.168.1.113/login","tls":"","persist":false,"proxy":"","authType":"","x":506,"y":122,"wires":[["119114df.454933"]]},{"id":"a9ea0344.3e1b6","type":"credentials","z":"f527fe93.c5a7b","name":"Set Username/Password","props":[{"value":"username","type":"msg"},{"value":"password","type":"msg"}],"x":252,"y":120,"wires":[["d9c504c8.4ccdd8"]]},{"id":"ad0bff2f.aa66a","type":"inject","z":"10bc02b3.e89565","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":168.66665649414062,"y":2007.666748046875,"wires":[["3ae3fdc3.527ba2"]]},{"id":"3ae3fdc3.527ba2","type":"moment","z":"10bc02b3.e89565","name":"","topic":"","input":"","inputType":"msg","inTz":"America/Chicago","adjAmount":0,"adjType":"days","adjDir":"add","format":"YYYY-MM-DD","locale":"en_US","output":"backupDate","outputType":"msg","outTz":"America/Chicago","x":368.6666564941406,"y":2007.666748046875,"wires":[["7489d38.29bd52c"]]},{"id":"7489d38.29bd52c","type":"subflow:f527fe93.c5a7b","z":"10bc02b3.e89565","name":"","x":603.3333129882812,"y":2006.3334350585938,"wires":[["13a8a275.0f20f6"]]},{"id":"13a8a275.0f20f6","type":"http request","z":"10bc02b3.e89565","name":"Go to HE backup URL","method":"GET","ret":"txt","paytoqs":"ignore","url":"http://192.168.1.113/hub/backup","tls":"","persist":false,"proxy":"","authType":"","x":837.6666564941406,"y":2008.6666870117188,"wires":[["6fb7e653.ea0658"]]},{"id":"6fb7e653.ea0658","type":"html","z":"10bc02b3.e89565","name":"Get Table Data","property":"payload","outproperty":"payload","tag":"td","ret":"html","as":"single","x":1076.3332977294922,"y":2007.33349609375,"wires":[["9645197c.870578"]]},{"id":"9645197c.870578","type":"function","z":"10bc02b3.e89565","name":"Get backup time","func":"\nvar m = msg.payload;\n\n//Get length of attributes array\nvar l = m.length;\n\n//Loop through array to check if html element is backup date/time\nfor (x=0;x<l;x++){\n    var str = msg.payload[x]\n    if(str.substring(0,10)==msg.backupDate){\n     var backupTime = str.substring(11,22)\n     var hour = backupTime.substring(0,2)\n     var min = backupTime.substring(3,5)\n     var sec = backupTime.substring (6,8)\n     var pm = backupTime.substring(9,11)\n     backupTime = hour+\"_\"+min+\"_\"+sec+\"_\"+pm\n    }\n}\n\n//Delete unneeded message elements\nmsg.payload = null;\nmsg.backupTime = backupTime;\nmsg.backupDateTime = str;\n\n//Copy and send the message\n//msg.payload = m; \nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":221.33334350585938,"y":2094.33349609375,"wires":[["ebf576e9.ec12e"]]},{"id":"ebf576e9.ec12e","type":"hubitat device","z":"10bc02b3.e89565","deviceLabel":"Hub Details","name":"","server":"144b18bb.8f6aa7","deviceId":"546","attribute":"","sendEvent":false,"x":431.333251953125,"y":2093.333740234375,"wires":[["72c34444.9d0acc"]]},{"id":"72c34444.9d0acc","type":"change","z":"10bc02b3.e89565","name":"Creating url and filename","rules":[{"t":"set","p":"hubName","pt":"msg","to":"\t\t$replace(\t  payload.locationName.currentValue,\t   \" \",\t   \"_\"\t)\t","tot":"jsonata"},{"t":"set","p":"firmwareVersion","pt":"msg","to":"payload.firmwareVersionString.currentValue","tot":"msg"},{"t":"set","p":"localIP","pt":"msg","to":"payload.localIP.currentValue","tot":"msg"},{"t":"set","p":"filePath","pt":"msg","to":"/Users/rakeshgupta/Documents/Hubitat_Backup/","tot":"str"},{"t":"set","p":"fileSuffix","pt":"msg","to":".lzf","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":665.3333435058594,"y":2093.33349609375,"wires":[["8588cc24.b0b2e8"]]},{"id":"8588cc24.b0b2e8","type":"function","z":"10bc02b3.e89565","name":"Set URL","func":"//Set URL to download backup\nmsg.url = \"http://\"+msg.localIP+\"/hub/backupDB?fileName=\"+msg.backupDate+\"~\"+msg.firmwareVersion+\".lzf\";\n\n//Set filename for saving download\nvar filename =msg.hubName+\"_\"+msg.backupDate+\"_\"+msg.backupTime+\"~\"+msg.firmwareVersion+msg.fileSuffix;\nmsg.filename = msg.filePath+filename\n\n//Set session cookies for the response header\nvar cookies = msg.responseCookies;\nmsg.headers = {\n    Cookie : \"HUBSESSION=\" + cookies.HUBSESSION.value};\n    \nreturn msg;\n","outputs":1,"noerr":0,"initialize":"","finalize":"","x":892.3333435058594,"y":2093.33349609375,"wires":[["6939d56c.444e44"]]},{"id":"6939d56c.444e44","type":"http request","z":"10bc02b3.e89565","name":"Download Backup","method":"GET","ret":"bin","paytoqs":"ignore","url":"","tls":"","persist":false,"proxy":"","authType":"","x":1090.3333435058594,"y":2094.666748046875,"wires":[["b032f04f.4775e"]]},{"id":"b032f04f.4775e","type":"file","z":"10bc02b3.e89565","name":"Save Backup File","filename":"","appendNewline":true,"createDir":false,"overwriteFile":"true","encoding":"none","x":1310.3333435058594,"y":2093.666748046875,"wires":[[]]},{"id":"144b18bb.8f6aa7","type":"hubitat config","name":"Hubitat - DV","usetls":false,"host":"192.168.1.113","port":"80","appId":"321","nodeRedServer":"http://192.168.1.86:1880","webhookPath":"/hubitat/webhook","autoRefresh":true,"useWebsocket":false,"colorEnabled":false,"color":"#74a7fe"}]

The following data needs to be updated before the flow will work:

  1. Hubitat Login: username/password in the credentials node. If you don't have HE security, you will need to adjust the flow.
  2. File path (directory) where you want to save the backup (set in the "Creating url and filename" change node).

Hope this helps someone - it was a fun little challenge!
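For those asking about a curl/shell equivalent: here's a rough sketch of just the timestamp-extraction step the flow performs. It assumes the table cell text on /hub/backup reads like "2021-03-28 10:15:30 PM", which is what the flow's "Get backup time" function implies; actually fetching the page would use the curl login/cookie approach from earlier in the thread:

```shell
# Convert the assumed cell text "YYYY-MM-DD hh:mm:ss AM" into the
# "hh_mm_ss_AM" form the flow embeds in the file name.
cell="2021-03-28 10:15:30 PM"   # sample value; really scraped from /hub/backup

backup_date="${cell%% *}"                          # everything before the first space
backup_time="$(echo "${cell#* }" | tr ': ' '__')"  # 10:15:30 PM -> 10_15_30_PM

echo "hubitat_${backup_date}_${backup_time}.lzf"
# prints: hubitat_2021-03-28_10_15_30_PM.lzf
```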

2 Likes

Sorry to bother you guys, but a simple question: I have a Windows machine; where is the script I need to automate the backup download?

You could try the PowerShell script from an earlier post.

2 Likes

Thanks @rakeshg, I'll take a look.

Nice! Worked fine for me.

1 Like