Unraid User Scripts and Hub Backups

Is anyone here using one of the backup scripts that can be found here along with Unraid User Scripts?

I've tried a couple of scripts that use curl to get the backups, but I'm not having any luck and am wondering if it's Unraid-specific.

The commands to create a backup really have nothing to do with any particular OS; they're just HTTP URLs. You could invoke them with a browser, if for nothing else than to test that you've got the syntax correct and a backup file is generated.
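For example, the backupDB endpoint can be hit directly; something like this (just a sketch, with the hub IP as a placeholder) should pull down the latest backup file:

# Sketch only: replace HUB_IP with your hub's address
curl -o latest.lzf "http://HUB_IP/hub/backupDB?fileName=latest"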

How you invoke the HTTP command is, of course, specific to whatever OS or NAS or scheduler utility you have available. Does Unraid have a scheduler tool? If it's Linux-based you might be able to use cron, but your best bet might be to ask in the Unraid user forums.
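If cron is available, a daily entry might look something like this (just a sketch; the script path is a made-up placeholder):

# Hypothetical crontab entry: run a backup script every day at 3:00 AM
0 3 * * * /boot/config/scripts/hubitat-backup.sh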

There is a User Scripts feature that lets one schedule scripts. This is what I am trying to use.

curl http://192.168.0.94/hub/backup | grep data-fileName | grep download | sed 's/<td class=\"mdl-data-table__cell--non-numeric\"><a class=\"download mdl-button mdl-js-button mdl-button--raised mdl-js-ripple-effect\" href=\"#\"  data-fileName=\"/\ /' | sed 's/\">Download<\/a><\/td>/\ /' | sed 's/ //g' | tail -1 | xargs -I @ curl -o @ http://192.168.0.94/hub/backupDB?fileName=@

And when I run it, I get this:

Script location: /tmp/user.scripts/tmpScripts/Hubitat backup/script

Note that closing this window will abort the execution of this script
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 44103 0 44103 0 0 94845 0 --:--:-- --:--:-- --:--:-- 94845
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0Warning: Failed to create the file
Warning: Warning: -effecthref=#data-fileName=2020-12-15~2.2.4.158.lzf>Download: No
Warning: such file or directory
0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
curl: (23) Failed writing body (0 != 1166)
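If I'm reading that error right, the sed expressions aren't stripping all of the surrounding HTML, so curl ends up being told to write to a name that still contains attribute text. A simpler extraction might be something like this (untested sketch; it assumes the /hub/backup page still exposes data-fileName attributes, and the IP and destination path are placeholders):

#!/bin/bash
# Untested sketch: pull the newest backup name from the /hub/backup page, then download it.
HUB_IP="192.168.0.94"
DEST="/mnt/user/backup/hubitat"
latest=$(curl -s "http://${HUB_IP}/hub/backup" | grep -o 'data-fileName="[^"]*"' | sed 's/data-fileName="//; s/"$//' | tail -1)
curl -o "${DEST}/${latest}" "http://${HUB_IP}/hub/backupDB?fileName=${latest}"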

I just set this up with Unraid User Scripts the other day based on something I read on here. I'll try to find the original post to provide credit and edit this one accordingly. (Edit: this is the post I used to help set it up for me: How Hubitat Backup/Restore has Saved My ■■■ - #11 by Abhijeet_Ghosh)

I have two scripts set up.

The script to save the backup:

#!/bin/bash
# Download the latest hub backup; --content-disposition keeps the server-supplied filename.
wget --content-disposition -P /mnt/disk10/backup/hubitat/ "http://192.168.0.69/hub/backupDB?fileName=latest"
echo "Hubitat Hub backup completed"

And the script to trim the number of backups kept:

#!/bin/bash
# Delete backup files older than 14 days.
find /mnt/disk10/backup/hubitat/ -mindepth 1 -type f -mtime +14 -delete
echo "Hubitat backups limited to 14 days"

Obviously the path and IP are my own. Truth be told, I haven't been running it for two weeks yet, so I can't confirm that the auto-delete of backups older than 14 days works, but the daily backups are in fact saving to the appropriate path, with the date and firmware version in the file name.
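If you want a sanity check on the retention script before trusting it, running the same find with -print instead of -delete just lists what would be removed:

# Dry run: list files older than 14 days without deleting them
find /mnt/disk10/backup/hubitat/ -mindepth 1 -type f -mtime +14 -print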

Hope this helps!

That worked great; I am now collecting weekly backups of my hubs.