Hubitat > NodeRed > MySQL > Grafana (LONG READ)

Just a warning: This is a LONG read.

I'm a NodeRed junkie for home automation tasks. I find it easy to do everything from network monitoring to rotating log files and firing off complex automations in Hubitat that I don't find Rule Manager suitable for.

With that said, the topic here is getting your logs and events from your Hubitat hub and finally into Grafana. I chose Hubitat because I want everything running in my house locally (with some exceptions like Alexa/Google Home, etc). Part of that means my logs and events aren't going to get shipped out to a third party. That's what this project solves. Having an enterprise IT background, I want logs going back as far as I can and right now, Hubitat doesn't provide that (nor do I ever expect them to because I want my hub to be a hub and nothing else). I also wanted a solution that didn't require any custom code or drivers within the Hubitat hub itself. So, I came up with this solution.

Hardware: What do you need?

You'll need an always-on "server" (or servers depending on what you have at hand) for running NodeRed, MySQL and Grafana. My platform of choice is Linux, but you could accomplish the same thing on Windows or Mac. For hardware, you could run this using something as simple as a Raspberry Pi, but be warned that this will probably tax that poor little machine to death, especially if you have a busy house with a lot of devices. If you have 2 or 3 of them lying around, that'd be a lot better (my current setup is running on a machine with 32GB of RAM and 16TB of drive space). However, I've also done this exact same setup on 2 RPis with a 1TB external drive attached to each. YMMV depending on what hardware you have available to you.

Software: What do you need?

Personally, I prefer Debian-based Linux systems, but I know that each of the platforms needed here runs on pretty much anything.
  1. NodeRed: This is the brains behind all of this and is required to get logs and events out of Hubitat. This needs to be on an always-on server.
  2. MySQL: This is pretty much a "pick your flavor" decision. I personally prefer Percona or MariaDB, but any "sql" type of server will work. For this tutorial, I'll be using MariaDB 10.1. This also needs to be on an always-on server.
  3. Grafana: This doesn't need to be on an always-on server, but I'd highly recommend it.

How do I set it all up?

I'm going to bypass setting up the various server pieces, as each of the 3 required components has excellent documentation of its own. I'm going to start off assuming you've set up NodeRed, MySQL, and Grafana already and get right into the configurations.

MySQL Configuration

I've created 6 tables in MySQL to handle the logging. I *could* have gone with just 2 tables, but I wanted to separate out battery, temperature, switch and motion events into their own tables for ease of indexing and querying, especially when my hub gets really chatty. With that said, if you feel comfortable with your own table structure, feel free to implement it. :slight_smile: One thing to note about the database user: if you are running NodeRed and MySQL on separate servers, you'll need to change the @'localhost' to match your environment, and **CHANGE THE PASSWORD TO SOMETHING SECURE!**

I've created a SQL script for setting up the database, tables and indexes:
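(The actual script is attached above. Purely as a hedged illustration of the general shape — the database, table, and column names below are hypothetical, not the attached script — one of the six tables might look something like this:)

```sql
-- Hypothetical sketch only; use the attached script for the real schema.
CREATE DATABASE IF NOT EXISTS hubitat;

CREATE TABLE IF NOT EXISTS hubitat.logs (
    id     INT UNSIGNED NOT NULL AUTO_INCREMENT,
    device VARCHAR(255),
    level  VARCHAR(16),
    msg    TEXT,
    time   DATETIME,
    PRIMARY KEY (id),
    KEY idx_device (device),   -- speeds up per-device queries
    KEY idx_time (time)        -- speeds up time-range queries in Grafana
) ENGINE=InnoDB;
```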

This is how your database layout will look after running the SQL file:

With the tables created, let's move on to NodeRed.

Node-Red configuration

You'll need to add 2 NPM packages to your palette in NodeRed (hamburger menu > Manage palette > Install).

Once you add those two packages, everything else is pretty easy to assemble. My flow below looks complicated. It actually isn't and I'll go over each part.

This part deals with logging everything from Hubitat's two exposed websockets, eventsocket and logsocket. What I've done is created two websocket listeners and pointed them to ws://[hubitat_ip]/eventsocket and ws://[hubitat_ip]/logsocket. These catch everything that you see in the /logs page on your hub.

Logs File Logger and Events File Logger each write the raw log/event data to files. The logger node is part of the advance-logger npm package. I have it configured to write a log file on my server that gets split at 500 MB and rotated as needed.

The "CREATE * INSERT" nodes are function nodes; each one just builds an INSERT statement for the Logging DB node.

Create Logs Insert:
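For reference, here's a hedged sketch of what such a function node body could look like, written as a standalone function for clarity. The field names (name, level, msg) and the table/column names are assumptions, and the :name placeholder binding assumes the MySQL node's named-parameter support — this is not the author's exact code:

```javascript
// Sketch of a "Create Logs Insert" function node (names are assumptions).
// Builds a parameterized INSERT so free-form log text can't inject SQL.
function createLogsInsert(msg) {
    // The logsocket delivers a JSON string; parse it if needed.
    const log = typeof msg.payload === 'string'
        ? JSON.parse(msg.payload)
        : msg.payload;

    // The MySQL node takes the query in msg.topic and binds values
    // from the msg.payload object via :name placeholders.
    msg.topic = 'INSERT INTO logs (device, level, msg, time) ' +
                'VALUES (:device, :level, :msg, NOW())';
    msg.payload = {
        device: log.name,
        level: log.level,
        msg: log.msg
    };
    return msg;
}
```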

Create Events Insert:

For Create Motions and Create Switches, I use a switch node to route each event to the appropriate insert:

Create Motions Insert:

Create Switches Insert:
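As an aside, that routing step could also be done with a single two-output function node instead of a switch node. This is a hedged sketch, not the author's flow; the eventsocket payload's "name" field is an assumption here:

```javascript
// Alternative to the switch node: route events by event name to
// separate outputs (motion -> output 1, switch -> output 2,
// everything else is dropped).
function routeEvent(msg) {
    const ev = typeof msg.payload === 'string'
        ? JSON.parse(msg.payload)
        : msg.payload;
    if (ev.name === 'motion') return [msg, null];
    if (ev.name === 'switch') return [null, msg];
    return [null, null]; // ignore other event types
}
```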

Finally, configure your Logging DB node and hit deploy. Within a few seconds, you should start seeing data flowing into your MySQL tables.

For the temperature and battery values, I found that using websockets didn't work out well because there's no rhyme or reason on some battery devices as to when they report in. So, the Maker API comes to our rescue (you DO have the Maker API setup, right?!). To access the Maker API from NodeRed, we use a http request node. Use a GET request and point it to http://[hubitat_ip]/apps/api/[app_id]/devices/all?access_token=[access_token].

The first node in this section is the Inject node, probably one of the most versatile nodes in all of NodeRed. For temperatures and batteries, I have it configured as follows:

This section functions a bit differently from the websockets, as we have to parse the device list out of the Maker API response. With a large number of devices, it could take a minute or two for the full device list to be returned from the hub. However, once you have your device list, we pass it through a function node looking for temperature or battery capabilities:
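A minimal sketch of that capability filter, assuming the /devices/all response gives each device a capabilities array containing capability names like "Battery" and "TemperatureMeasurement" (the helper name is mine, not the author's):

```javascript
// Keep only devices that report a Battery or TemperatureMeasurement
// capability. The capabilities array can mix strings and objects in
// Maker API output, so only string entries are compared.
function filterCapabilities(devices) {
    return devices.filter(d =>
        Array.isArray(d.capabilities) &&
        d.capabilities.some(c =>
            c === 'Battery' || c === 'TemperatureMeasurement'));
}
```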

Then, we pass that through a simple split node, splitting on msg.payload. Finally, pass that through to a CREATE INSERT function (as shown above), and then to the database. Deploy again and you should see data in your MySQL tables.

The last part of the flow is error handling. Currently, I just pass the error to the debugger, but there are lots of things you could do here like alerts, emails, tweets, etc. The sky is the limit really.


That's it for the NodeRed configuration. At the end of the post is the exported flow for you to import, should you choose to.

Grafana Configuration

Lastly, we come to Grafana. Setting up Grafana is extremely simple, but to use this solution, you'll need to add the MySQL DB Data source plugin. After that, it's simply a matter of creating your dashboards as you see fit.

This is my temperatures dashboard that reads from the temperatures table. The setup here is extremely simple as it's a combination of a table, a graph and two singlestats. Basically, read values from MySQL and display in graphs. :wink:
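As an illustration of the kind of panel query involved (table and column names here are hypothetical, not my exact schema), a Grafana MySQL time-series query might look like:

```sql
-- $__timeFilter() is Grafana's macro for the dashboard's time range.
SELECT
  time AS "time",
  device AS metric,
  value
FROM temperatures
WHERE $__timeFilter(time)
ORDER BY time
```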

If you've made it this far, thank you. It looks complicated, but once you have it up and running, it pretty much removes the need for ANY external resources. I've tested this on a completely isolated network with no internet access and it works flawlessly.

To import this flow, import the JSON contained here:


That is a seriously great write-up. As I understand it, this method should not put any additional load on your hub because it is connecting to the websockets? I have seen other Grafana solutions mentioned here that use an app, and they seem to slow down the hub on some occasions.

Do you know if these websockets will be around for a while?



Just the calls to the Maker API would add to the load of the hub and I have mine set for 15 minutes and 6 hours.

Honestly, I have no idea if they will or won't be around. I know I'm not the only one using them though, so hopefully HE support sees that and would say something before axing them.


Thanks for this, I have a new Raspberry Pi arriving tomorrow just for this purpose - this post is very timely :smiley:


Just looking at this and wondering why we need to have node red here for monitoring the hub events?

You are connecting with the event and log streams? Is a server needed, or can this be done all client-side? I'm working on something that will function as a monitor of hub performance and trying to work out what information I can extract. Please forgive my naivety; I see Grafana as a great dashboard, but I don't see how node red is needed to access the hub info...

NodeRed isn't "needed", per se. You just need something that can act as a websocket client and import into MySQL. I actually started this whole thing with a couple of nodejs scripts to subscribe to the websockets and pushed out to flat files.

As for what performance stats you can get, not many that I have seen. At least not hub stats. It's been one of my requests to have the Maker API expose more hub data that can be used for instrumentation.

One problem is that the logs don't really go back far enough to track down certain problems.

Last night I had an issue with the app that supports the device handlers I've written - the app had crashed somehow and was removed, but that led to lots of errors from the device handlers (as child devices), which swamped whatever the original error was because of the limited log storage on the hub itself.

Having an external log server would have helped me track down the issue.


Awesome job @corerootedxb. All of a sudden I have an extra Pi and also a C5 hub. This will definitely make it much easier for a noob like me to get node red going. Thanks!


Good to hear! Funny story... Last time my wife and I went to Target, we were walking around and she saw some RPis on a shelf and she looked at me and said "Oh look! Those little computers you like to use! How many are you getting?" I cleaned them out and got the 4 they had on the shelf.

I got lucky in the wife lottery. Lol


Good post. I'm too lazy for all this, so I just log the event stream to disk.


@corerootedxb - this is a great write up and exactly what I have been wanting to do, but the "plugins" developed for InfluxDB or Grafana cause too much overhead on HE.

So you recommend not using a RPi, but is this truly the case? I have a 3B+ doing nothing right now, and I am most interested in how apps perform. How are you handling app events?

You could use a single RPi. I just don't recommend it, as NodeRed+MySQL+Grafana will chew through 1GB of RAM and trash a cheap SD card easily. If you can separate the servers out, that'd be a lot better.

MySQL can be tuned to run in a smaller memory space, but I generally say that using a RPi as a DB server is never a good idea; there's just too much disk and RAM thrashing going on. Maybe as a NoSQL server it would fare better, and you could swap out MySQL for something like MongoDB or another NoSQL server.

With all that said, try it! If you experience bottlenecks, order a second (or third) RPi! :slight_smile:

For me, events go into their own table which I can then search on. So, if I want to see everything that an app has done over the past week, I can query that. In Grafana, I'm tracking how often an app "activates" based upon that table. I'm not doing much beyond that as I try to keep my hub as clean as possible. I'm only running 5 custom apps right now and none of those are very heavy apps (besides Alexa TTS).

Other cool things that I can do with this:
- Create heat maps of which motion sensors get activated the most
- Track how long lights have been turned on and at what level to calculate energy usage
- Exception tracking in NodeRed that sends Pushover notifications when exceptions get thrown in HE
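As a sketch of the energy-usage idea: given a time-ordered list of on/off events for one switch, total on-time falls out of pairing the transitions. The event shape ({ time, value }) is an assumption for illustration, not the exact table schema:

```javascript
// Sum the milliseconds a switch spent "on", given time-ordered events.
// `now` closes out an interval if the switch is still on at the end.
function totalOnTimeMs(events, now) {
    let total = 0;
    let onSince = null;
    for (const ev of events) {
        if (ev.value === 'on' && onSince === null) {
            onSince = ev.time;            // switch turned on
        } else if (ev.value === 'off' && onSince !== null) {
            total += ev.time - onSince;   // close the interval
            onSince = null;
        }
    }
    if (onSince !== null) total += now - onSince; // still on at `now`
    return total;
}
```

Multiply the on-time by the light's wattage (and level, if dimmed) to estimate energy used.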


@corerootedxb, it's great when you see someone who has the knowledge to put all the pieces of this many systems together in a way that gets this much flexibility. It's rare that this same person would spend the time to make a step by step manual (with pictures even) and lay it out in a way that makes the challenge seem a whole lot less daunting to a layman. It's like you are trying to take @ogiewon's throne or something :wink:.

This might be the reason I use (there have been many before) to finally jump into the Node-Red rabbit hole. I even have a spare pi waiting for a mission... actually, can node-red run on a PC :thinking:... will have to check that. I already have InfluxDB with Grafana configured, but had to disable it because I'm trying to troubleshoot biweekly hub crashes and I think that is the culprit.

In any case, I think this and finally setting up my HE dashboards again will be my next mini-projects.


Thanks Stephan. I appreciate it.

I'm not after anyone's throne (especially @ogiewon's; that man does INCREDIBLE work for the community and I was a fan of his on the ST forums as well). I think my main goal here is to demonstrate that a lot of things that people want out of HE can be accomplished easily without overloading the hub itself. That's my goal for my hub, at least. I want maximum logging and reporting without bogging down the hub.

I'm working on a little side project that accomplishes this same goal without NodeRed; a simple nodejs script (for those that are NodeRed-averse). I should be done with it by this afternoon. :smiley:


All hail the new King! :wink:

Seriously though, I am very interested in learning more about how to use Hubitat as a provider of webSockets data. If somehow this is a lighter-weight method of sending data to InfluxDB, I am all for it! Since InfluxDB has to run on a separate box anyway, I wouldn't mind running a NodeJS app on that same system to collect and store Hubitat Elevation data.

How well versed are you in InfluxDB? I could use a hand on that part if you are up to doing some nodejs coding?

Not too skilled at InfluxDB at all. The InfluxDB App doesn't appear to me like it should be causing hub slow-downs/crashes. I even modified it to use Async Http calls to see if that would help. You should be able to steal from the existing groovy code, although going even simpler may be better (i.e. the InfluxDB Logger App seems to send quite a bit of data for each event - why not just a name/value pair?)

const Influx = require('influx');
const os = require('os');

module.exports = {
    write : function(data, dest) {
        // Connection settings come from the dest argument.
        const influx = new Influx.InfluxDB({
            host: dest.host,
            port: dest.port,
            username: dest.user,
            password: dest.pass,
            database: dest.database,
            schema: dest.schema
        });

        // Write a single point; fields come from the data argument.
        return influx.writePoints([{
            measurement: 'response_times',
            tags: { host: os.hostname() },
            fields: { duration: data.duration, path: data.path }
        }]);
    }
};
This is what I have so far in Node for influx. I'll take a look at that code and see if I can convert the Groovy calls over to Node and hopefully have a test release of the app in about an hour. :smiley:


I have the basic app complete, but I haven't been able to get InfluxDB working yet. I'm moving the files up to github now and will append this post when I get the readme.md completed.

Who wants to be a guinea pig today? :smiley:

GitHub is up. Be warned. Dragons be here and danger and crap... Ok, not really danger, but the code is far from final. It's workable though and doesn't bomb out too much :wink:

Is this part in bold correct? Putting a unique constraint on deviceId is kicking back an ER_DUP_ENTRY for each switch insert after the first one for a given deviceId. Same with motions.