Hubitat > NodeRed > MySQL > Grafana (LONG READ)

OK ... answered my own question. Reading up a bit more on the InfluxDB flows, it turns out that using 'influx batch' is the proper way to handle the array values I was seeing in the debug payload. I was using the standard 'influxdb out' node to start, but switching to 'influx batch' now parses out the measurement type and values as expected!


My first question is still outstanding --- do I need to read the entire message, or is just the payload an option ... or does it really make any difference as long as it works?

You need the entire message because the event socket messages are in a JSON format that NodeRed won't push directly to msg.payload. However, in terms of performance it's fine, as NodeRed disposes of the message object when it goes out of scope anyhow. Think of needing the entire message object as a simple inconvenience.

You could stuff the entire message object into msg.payload by using something like a Change node, but there really isn't any need to unless your flow gets more complicated.
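If you did want to do that, here's a minimal sketch of a function node that gathers the top-level event fields back under msg.payload. The field names (name, value, deviceId, displayName) are assumptions about what the Hubitat event socket emits, so adjust to whatever you actually see in your debug node:

```javascript
// Hypothetical function node: with the websocket node's "whole message"
// option, the event fields arrive at the top level of msg. This gathers
// them back under msg.payload for downstream nodes that expect it there.
// Field names (name, value, deviceId, displayName) are assumptions about
// what the Hubitat event socket emits.
function moveEventToPayload(msg) {
    msg.payload = {
        name: msg.name,               // attribute name, e.g. "temperature"
        value: msg.value,             // attribute value as reported
        deviceId: msg.deviceId,       // numeric device id
        displayName: msg.displayName  // device label
    };
    return msg;
}
```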

@corerootedxb Thank you for the explanation! My concern was the performance impact, but based on your description the impact is minimal, if any, and a necessary evil to accomplish the logging.

1 Like

No problem! There are other ways as both the event socket and log socket can be captured from just about any programming language. With what Hubitat gave us with the sockets, the sky is the limit in terms of reporting. I honestly can't tell you the last time I even looked at the Logs page on my hub.

Thanks SO much for this post and the previous one. I just jumped down the rabbit hole this morning, and was managing to get my data to the debug window, but immediately started running into the "No Measurement Specified" error.

Now, I'm beginning to see Humidity, Power and Temperature data in the "MEASUREMENTS" (table?). Woot! Of course GRAFANA remains to be figured out, so I may not be out of the woods yet.

Question to all: do I need to set any filters in the filter box? I have, but I don't know if that is limiting the messages that get passed to Set Influx Data or not...and if it is, can I use a filter like msg.payload contains * to get all messages?

Thanks!
S.

Yet another voice chiming in to say thanks for all of the hard work on this tutorial and the contributions from helpful community members. I've finally succumbed to the temptation and put together the setup from @corerootedxb's original post above, including the MySQL (MariaDB) database.

A question for the database pros, though: I see that a lot of folks prefer InfluxDB over MySQL. That intuitively makes sense, as I understand that the former is geared toward time series data. Plus, it sounds like folks have had headaches with timestamp alignment between MySQL and Grafana, which sounds like the sort of thing that would drive me nuts.

I've considered switching to InfluxDB since I'm not too far in. However, are there limitations I should be aware of before making a change? I thought I'd read something about InfluxDB being able to keep a longer history, but only keeping some subset of descriptive statistics. (I'm a database n00b, so please forgive my ignorance.)

TL;DR Experienced Hubitat data wranglers, should I stick with MySQL or jump ship to InfluxDB?

1 Like

InfluxDB all the way for me; its integration with Grafana is wonderful. You can set up the tables to keep the data forever, or expire it at a certain age. One of the best features is that when you add a device to HE, its data just starts showing up on the Grafana charts with no edits or additions needed at all.

3 Likes

I use both, actually. I use Influx for graphs but MySQL for other things. I push the events and logs into MySQL for historical searching purposes. I have a MySQL table called batteries where I update the reported battery value and a last-updated timestamp every time any sort of event happens on that device, so I can see if a battery device dies. This is done via a DB trigger on the events table. I have a table in Grafana that shows this table data.
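The setup above does this with a DB trigger on the events table. Purely as an illustration of the same bookkeeping done in the flow itself, a function node could prepare a parameterized UPDATE for the node-red-node-mysql node (which runs the query it finds in msg.topic, with msg.payload as the bind values). The batteries table and its device_id/level/last_updated columns are made-up names for this sketch:

```javascript
// Hypothetical alternative to the DB trigger: prepare a parameterized
// UPDATE for the node-red-node-mysql node, which executes the query in
// msg.topic using msg.payload as bind parameters. The batteries table
// and its columns (device_id, level, last_updated) are made-up names.
function buildBatteryUpdate(msg) {
    msg.topic =
        "UPDATE batteries SET level = ?, last_updated = NOW() " +
        "WHERE device_id = ?";
    // Bind values in the order of the ? placeholders above.
    msg.payload = [Number(msg.value), Number(msg.deviceId)];
    return msg;
}
```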

2 Likes

Thanks for the input, y'all. It sounds like InfluxDB is worth a shot, at least in parallel with the MySQL database. I'm already looking forward to checking out all of the good stuff here and in other posts on the subject. :nerd_face:

1 Like

With the new Maker API POST URL change, I'm going to redesign my NodeRed flow to take advantage of it. I'll post the new flow when I get it done.

3 Likes

I really appreciate the HE team upgrading the Maker API for better "bidirectional" support. Being able to selectively push out data rather than having to use the floodgate approach of the websocket (PLEASE do not remove it, btw) is an awesome option. I dabbled with the new Maker API on my dev hub during the beta and was very impressed with how well it worked. It was only for a few devices as a test, but I look forward to using it on my main hub and seeing how it holds up.

I also really look forward to how you approach utilizing this as you are my NR/database guru.
Show us the way oh wise one.:bowing_man:

1 Like

I keep telling my coworkers that and they just laugh at me. lol

1 Like

Do you have any recommendations for completely code-illiterate people (like myself) on any sort of "learning this for dummies" resource? Maybe YouTube or some other form of documentation.

I recently got a QNAP NAS and would like to implement some of this stuff, and I can follow your instructions above, but I think I should first familiarize myself with some of the terminology and areas you're instructing on.

@stephack could you share the JSON file for your NodeRed flow and explain a bit more how you got your log to look so nice in a spreadsheet?

I'll have to export it for you tomorrow when I can get to my desktop.

To be honest, I'm not sure I remember all the steps, but it really boils down to how comfortable you are with data importing and manipulation in Excel. I'm simply importing the log files I created into Excel as a delimited text file and then applying some formatting, etc. I'll try to list as many steps as I can remember tomorrow as well.

1 Like

Quick node-by-node breakdown:

  1. pulls data from the logsocket
  2. extracts the data I wanted and converts it to a delimited text file (I chose || as the delimiter since this shouldn't normally appear anywhere in the logs)
  3. fixes anomalies that occur when logs contain special characters or actual HTML
  4. saves the log file to the location of your choice
  5-9. set the maximum size of the log file; when it is reached, archive it and create a new log file
  10-12. set the maximum number of log files to keep and delete the oldest

Log size and number of logs can be easily changed to what you need.
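For anyone more comfortable with JavaScript than JSONata: node 2's delimiting step boils down to joining the log-socket fields with ||, which could equally be done in a plain function node like this (field names match those used in the exported flow):

```javascript
// Plain function-node equivalent of the flow's JSONata change node:
// join the log-socket fields (level, type, id, name, time, msg)
// into one "||"-delimited line.
function formatLogLine(msg) {
    msg.payload = [msg.level, msg.type, msg.id, msg.name, msg.time, msg.msg]
        .join("||");
    return msg;
}
```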

Below is my exported flow for logging. You will need to point the first node to the logsocket you created and replace the log path and archive path with your specific locations.

Excel details to come later.
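For reference, the archive-name step (the moment node formatting the timestamp as YYYY-MM-DD_hh-mm-ss, plus the appended ".log") can be approximated in plain JavaScript; this sketch uses a 24-hour clock for simplicity:

```javascript
// Approximation of the flow's moment node + ".log" append: build an
// archive name like "2019-11-03_08-39-17.log" (24-hour clock here).
function archiveName(d) {
    const pad = n => String(n).padStart(2, "0");
    return d.getFullYear() + "-" + pad(d.getMonth() + 1) + "-" + pad(d.getDate()) +
        "_" + pad(d.getHours()) + "-" + pad(d.getMinutes()) + "-" + pad(d.getSeconds()) +
        ".log";
}
```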

[
    {
        "id": "9c740a.039f3bf8",
        "type": "file",
        "z": "43aea04d.3bd6e",
        "name": "Output to he_logs.log",
        "filename": "/yourPath/he_logs.log",
        "appendNewline": true,
        "createDir": false,
        "overwriteFile": "false",
        "encoding": "none",
        "x": 820,
        "y": 260,
        "wires": [
            [
                "6b7bba59.42d0c4"
            ]
        ]
    },
    {
        "id": "6b7bba59.42d0c4",
        "type": "fs-ops-size",
        "z": "43aea04d.3bd6e",
        "name": "Log Size",
        "path": "/yourPath",
        "pathType": "str",
        "filename": "he_logs.log",
        "filenameType": "str",
        "size": "filesize",
        "sizeType": "msg",
        "x": 260,
        "y": 340,
        "wires": [
            [
                "b0409461.ec2518"
            ]
        ]
    },
    {
        "id": "cd31194d.e094c8",
        "type": "fs-ops-move",
        "z": "43aea04d.3bd6e",
        "name": "Archive Log",
        "sourcePath": "/yourPath",
        "sourcePathType": "str",
        "sourceFilename": "he_logs.log",
        "sourceFilenameType": "str",
        "destPath": "/yourArchivePath",
        "destPathType": "str",
        "destFilename": "newname",
        "destFilenameType": "msg",
        "link": false,
        "x": 930,
        "y": 340,
        "wires": [
            [
                "da05758.c7ae488"
            ]
        ]
    },
    {
        "id": "da05758.c7ae488",
        "type": "fs-ops-dir",
        "z": "43aea04d.3bd6e",
        "name": "# of Logs",
        "path": "/yourArchivePath",
        "pathType": "str",
        "filter": "*",
        "filterType": "str",
        "dir": "files",
        "dirType": "msg",
        "x": 400,
        "y": 420,
        "wires": [
            [
                "73b6cf78.d6a33"
            ]
        ]
    },
    {
        "id": "7d3a757b.d97bdc",
        "type": "fs-ops-delete",
        "z": "43aea04d.3bd6e",
        "name": "Del Oldest Log",
        "path": "/yourArchivePath",
        "pathType": "str",
        "filename": "files[0]",
        "filenameType": "msg",
        "x": 720,
        "y": 420,
        "wires": [
            []
        ]
    },
    {
        "id": "ff5df71f.907cc8",
        "type": "moment",
        "z": "43aea04d.3bd6e",
        "name": "",
        "topic": "",
        "input": "timestamp",
        "inputType": "msg",
        "inTz": "America/New_York",
        "adjAmount": 0,
        "adjType": "days",
        "adjDir": "add",
        "format": "YYYY-MM-DD_hh-mm-ss",
        "locale": "en_US",
        "output": "newname",
        "outputType": "msg",
        "outTz": "America/New_York",
        "x": 620,
        "y": 340,
        "wires": [
            [
                "7fd311fb.e9d21"
            ]
        ]
    },
    {
        "id": "7fd311fb.e9d21",
        "type": "string",
        "z": "43aea04d.3bd6e",
        "name": "\".log\"",
        "methods": [
            {
                "name": "append",
                "params": [
                    {
                        "type": "str",
                        "value": ".log"
                    }
                ]
            }
        ],
        "prop": "newname",
        "propout": "newname",
        "object": "msg",
        "objectout": "msg",
        "x": 790,
        "y": 340,
        "wires": [
            [
                "cd31194d.e094c8"
            ]
        ]
    },
    {
        "id": "b0409461.ec2518",
        "type": "switch",
        "z": "43aea04d.3bd6e",
        "name": "Archive If > 2MB",
        "property": "filesize",
        "propertyType": "msg",
        "rules": [
            {
                "t": "gt",
                "v": "2000000",
                "vt": "num"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 420,
        "y": 340,
        "wires": [
            [
                "ff5df71f.907cc8"
            ]
        ]
    },
    {
        "id": "73b6cf78.d6a33",
        "type": "switch",
        "z": "43aea04d.3bd6e",
        "name": "15 Log Limit",
        "property": "files.length",
        "propertyType": "msg",
        "rules": [
            {
                "t": "gte",
                "v": "15",
                "vt": "num"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 550,
        "y": 420,
        "wires": [
            [
                "7d3a757b.d97bdc"
            ]
        ]
    },
    {
        "id": "62fe0342.ad756c",
        "type": "comment",
        "z": "43aea04d.3bd6e",
        "name": "format/create log",
        "info": "",
        "x": 1040,
        "y": 260,
        "wires": []
    },
    {
        "id": "eb6371f4.6c077",
        "type": "comment",
        "z": "43aea04d.3bd6e",
        "name": "archive when too big",
        "info": "",
        "x": 1130,
        "y": 340,
        "wires": []
    },
    {
        "id": "90358433.c12f58",
        "type": "comment",
        "z": "43aea04d.3bd6e",
        "name": "maintain 15 log archive",
        "info": "",
        "x": 940,
        "y": 420,
        "wires": []
    },
    {
        "id": "907fadd6.c24b8",
        "type": "change",
        "z": "43aea04d.3bd6e",
        "name": "Format CSV delim (||) ",
        "rules": [
            {
                "t": "set",
                "p": "payload",
                "pt": "msg",
                "to": "msg.level & '||' & msg.type & '||' & msg.id & '||' & msg.name & '||' & msg.time & '||' & msg.msg\t",
                "tot": "jsonata"
            }
        ],
        "action": "",
        "property": "",
        "from": "",
        "to": "",
        "reg": false,
        "x": 440,
        "y": 260,
        "wires": [
            [
                "5989b2b4.2dacec"
            ]
        ]
    },
    {
        "id": "5989b2b4.2dacec",
        "type": "string",
        "z": "43aea04d.3bd6e",
        "name": "Decode Html",
        "methods": [
            {
                "name": "decodeHTMLEntities",
                "params": []
            }
        ],
        "prop": "payload",
        "propout": "payload",
        "object": "msg",
        "objectout": "msg",
        "x": 630,
        "y": 260,
        "wires": [
            [
                "9c740a.039f3bf8"
            ]
        ]
    },
    {
        "id": "53189136.8abce",
        "type": "debug",
        "z": "43aea04d.3bd6e",
        "name": "",
        "active": false,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "x": 400,
        "y": 200,
        "wires": []
    },
    {
        "id": "c0527395.c6582",
        "type": "websocket in",
        "z": "43aea04d.3bd6e",
        "name": "HE Logs Socket",
        "server": "",
        "client": "b2f3490e.fc20e8",
        "x": 240,
        "y": 260,
        "wires": [
            [
                "907fadd6.c24b8",
                "53189136.8abce"
            ]
        ]
    },
    {
        "id": "b2f3490e.fc20e8",
        "type": "websocket-client",
        "z": "",
        "path": "ws://192.168.1.111/logsocket",
        "tls": "",
        "wholemsg": "true"
    }
]
3 Likes

This is great, thanks! Turning the CSV into an Excel spreadsheet that looks good is probably something I can fumble through. But with the NodeRed stuff I'm pretty clueless, so thanks again!!

1 Like

This is a snippet of what the output log file looks like:

I then import this into excel and use "||" as the custom delimiter:

Once loaded in the spreadsheet, you should end up with a page that looks like this:

I set up the spreadsheet to refresh the data when the file is opened.
I also do some formatting, sorting, quick access to archived logs and other customizations using the Power Query Editor but those steps are a bit too detailed to share here...and more for my personal use case. PM me if you are familiar with the Power Query editor and want to dive a bit deeper.

3 Likes

So I've imported the NodeRed flow. I added "node-red-contrib-fs-ops" to try to get it to work. Was that correct? I think the output into the log file doesn't look right in some cases. It seems like some things are being cut off.

info||dev||1164||Living Room Window 2||2019-11-03 08:39:17.999||Living Room Window 2 temperature is 62.97°F

info||app||1771||Living Room Recessed Lights - off only||2019-11-03 08:39:30.358||Motion inactive Living Room Motion Sensoinfo|info||app||1709||Study Lights - Occupancy||2019-11-03 08:39:30.392||Not turning off: disabled by Living Roominfo||app||1771||Living Room Reinfo||dev||586||Kitchen Motion Sensor||2019-11-03 08:39:31.345||Kitchen Motion Sensor is 33%RH 
gyDuration is 96.55 Days 
info||dev||1483||Outdoor Motion Sensor||2019-11-03 08:39:37.120||Outdoor Motion Sensor illuminance is 2604 Lux
info||app||1587||Living Room Fan<span style='color:red'> (paused)</span>||2019-11-03 08:39:39.523||Motion on info||app||1709||Study Lights - Occupancy||2019-11-03 info||app||1771||Living Room Recessed Lights - off only||2019-11-03 08:39:39.5info||app||1588||Study Fan<span style='color:retrace||app||743||Morning Wake Up Motion Zone Controller||2019-11-03 08:39:39.564||Qualifying triggers were notinfo||app||1info||dev||1106||Living Room A/C Power Switch||2019-11-03 08:39:40.204||Living Room A/C Power Switch power iinfo||dev||1483||Outdoor Motion Sensor||2019-11-03 08:39:47.062||Outdoor Motion Sensor illuminance is 2599 Lux
info||app||1771||Living Room Recessed Lights - off only||2019-11-03 08:39:51.228||Motion inactive Living Room Moinfo||app||1709info||app||1709||Study Lights - Occupancy||2019-11-03 08:39:51.257||Not turning off: disabled by Linfo||app||1771||Living Room Recessed Liginfo||app||1771||Living Room Recessed Lights - off only||2019-11-03 08:39:54.335||Motion inactive Living Room Minfo||app||1709|info||app||1709||Study Lights - Occupancy||2019-11-03 08:39:54.359||Not turning off: disabled by info||app||1771||Living Room Recessed Lighinfo||app||1709||Study Lights - Occupancy||2019-11-03 08:39:58.031||Motion on additional motion sensor Livinginfo||app||1588||Study info||app||1587||Living Room Fan<span style='color:red'> (paused)</span>||2019-11-03 08:39:58.050||Motion on additional motion sensor info||app||1771||Living Room trace||app||743||Morning Wake Up Motion Zone Controller||2019-11-03 08:39:58.052||Qualifying triinfo||app||1771||Living Roinfo||dev||1483||Outdoor Motion Sensor||2019-11-03 08:40:02.051||Outdoor Motion Sensor illumininfo||dev||618||Boys’ Room Temp/Humidity Sensor||2019-11-03 08:40:06.060||Boys’ Room Temp/Humidity Sensor: humidity is 35%
.6°F

Can you open the log file in Notepad (fullscreen) and take a screenshot? I can't quite read the info above.