Share Your Data Logging and Visualization Implementations

Like this: the math may change based on your data, but you have to get it into kWh.

Here's the JSON for the 7-day panel.

{
"aliasColors": {
"Costs": "#7eb26d",
"kWh": "#eab839"
},
"bars": true,
"dashLength": 10,
"dashes": false,
"datasource": "$powerdatasource",
"decimals": 2,
"editable": true,
"error": false,
"fieldConfig": {
"defaults": {
"custom": {},
"links":
},
"overrides":
},
"fill": 1,
"fillGradient": 0,
"grid": {},
"gridPos": {
"h": 8,
"w": 9,
"x": 0,
"y": 11
},
"hiddenSeries": false,
"hideTimeOverride": false,
"id": 31,
"interval": "1d",
"legend": {
"alignAsTable": true,
"avg": true,
"current": true,
"hideEmpty": false,
"hideZero": false,
"max": true,
"min": false,
"rightSide": false,
"show": true,
"total": true,
"values": true
},
"lines": false,
"linewidth": 2,
"links": ,
"nullPointMode": "connected",
"paceLength": 10,
"percentage": false,
"pointradius": 5,
"points": false,
"renderer": "flot",
"seriesOverrides": [
{
"$$hashKey": "object:6028",
"alias": "Costs",
"yaxis": 2
}
],
"spaceLength": 10,
"span": 12,
"stack": false,
"steppedLine": false,
"targets": [
{
"alias": "Costs",
"dsType": "influxdb",
"groupBy": [
{
"params": [
"$__interval"
],
"type": "time"
},
{
"params": [
"none"
],
"type": "fill"
}
],
"hide": false,
"measurement": "power",
"orderByTime": "ASC",
"policy": "default",
"query": "SELECT sum("value") 0.22/1000 FROM "power" WHERE "host" = 'mainpower' AND $timeFilter GROUP BY time(1d) fill(null)",
"refId": "B",
"resultFormat": "time_series",
"select": [
[
{
"params": [
"value"
],
"type": "field"
},
{
"params": [],
"type": "sum"
},
{
"params": [
"/1000/15
$kwhprice"
],
"type": "math"
}
]
],
"tags": [
{
"key": "deviceName",
"operator": "=~",
"value": "/^$Device$/"
}
]
},
{
"alias": "kWh",
"dsType": "influxdb",
"groupBy": [
{
"params": [
"$__interval"
],
"type": "time"
},
{
"params": [
"none"
],
"type": "fill"
}
],
"hide": false,
"measurement": "power",
"orderByTime": "ASC",
"policy": "default",
"query": "SELECT sum("value") FROM "power" WHERE "host" = 'mainpower' AND $timeFilter GROUP BY time(1d) fill(null)",
"refId": "A",
"resultFormat": "time_series",
"select": [
[
{
"params": [
"value"
],
"type": "field"
},
{
"params": ,
"type": "sum"
},
{
"params": [
"/1000/15"
],
"type": "math"
}
]
],
"tags": [
{
"key": "deviceName",
"operator": "=~",
"value": "/^$Device$/"
}
]
}
],
"thresholds": ,
"timeFrom": "7d",
"timeRegions": ,
"timeShift": null,
"title": "Last 7 Day Power Consumption/Costs",
"tooltip": {
"msResolution": false,
"shared": true,
"sort": 0,
"value_type": "cumulative"
},
"type": "graph",
"xaxis": {
"buckets": null,
"mode": "time",
"name": null,
"show": true,
"values":
},
"yaxes": [
{
"$$hashKey": "object:8675",
"decimals": 1,
"format": "kwatth",
"label": "Usage",
"logBase": 1,
"max": "200",
"min": null,
"show": true
},
{
"$$hashKey": "object:8676",
"decimals": 2,
"format": "currencyUSD",
"label": "Cost",
"logBase": 1,
"max": null,
"min": "0",
"show": true
}
],
"yaxis": {
"align": true,
"alignLevel": null
},
"pluginVersion": "7.3.1"
}
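
For anyone adapting the math step: the "/1000/15" works out to kWh if the device reports instantaneous power in watts about every 4 minutes (15 samples per hour) - the daily sum of watt readings divided by the samples per hour gives watt-hours for the day, and dividing by 1000 gives kWh; the cost target then multiplies by the $kwhprice variable. If your device reports at a different interval, or already reports energy, the divisor changes. A rough Python sketch of the same arithmetic (the readings and price below are made up):

# Same arithmetic as the panel's "math" step, assuming one watt reading
# every 4 minutes (15 samples/hour, 360/day). All values are placeholders.
readings_w = [450.0] * 360            # pretend the plug reported 450 W all day
kwh_price = 0.22                      # the $kwhprice dashboard variable, $/kWh

samples_per_hour = 15
daily_sum_w = sum(readings_w)         # what InfluxDB's sum("value") returns per day

kwh = daily_sum_w / 1000 / samples_per_hour   # the "/1000/15" math step
cost = kwh * kwh_price                        # the cost target

print(f"{kwh:.1f} kWh, ${cost:.2f}")          # -> 10.8 kWh, $2.38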

Nothing fancy, just temperature reporting from a bunch of motion/multisensors and our two AC units, as well as the total usage per day, which I manually load into InfluxDB using a small C# application I built to parse the data from my power company's app export function. This is for our part-time house, in case anyone wonders why the power usage varies so much from month to month.

Translations:
Top graph: temperatures and AC unit on/off status.
Middle graph: energy usage per day, the total (from the power company) and the usage per AC unit. Our main heating is the AC units, so that's why I put them separately.
Bottom graphs: total power usage per month and year, and daily power usage of the AC units vs. outdoor temperature.
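
If anyone wants to do something similar without C#, the same import idea is only a few lines of Python - this is a rough sketch, not my actual code, and the CSV column names, measurement and connection details are placeholders for whatever your power company's export and your InfluxDB look like:

# Rough sketch: load a power company's CSV export of daily kWh into InfluxDB 1.x.
# Column names, measurement name and connection details are placeholders.
import csv
from datetime import datetime, timezone

from influxdb import InfluxDBClient   # pip install influxdb (the 1.x client)

client = InfluxDBClient(host="192.168.1.50", port=8086, database="power")

points = []
with open("export.csv", newline="") as f:
    for row in csv.DictReader(f):
        # assumes the export has "date" (YYYY-MM-DD) and "kwh" columns
        day = datetime.strptime(row["date"], "%Y-%m-%d").replace(tzinfo=timezone.utc)
        points.append({
            "measurement": "grid_energy",
            "tags": {"source": "power_company"},
            "time": day.isoformat(),
            "fields": {"kwh": float(row["kwh"])},
        })

client.write_points(points)
print(f"Wrote {len(points)} daily totals")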

3 Likes

Only 2 years later!

HundredGraphs.com has finally come to Hubitat. If you don't know, HundredGraphs is a service for logging and making graphs for smart homes. We are mostly angled towards energy collection, but you can log and graph anything you want.
The basic FreeForever account is, duh, free.
However, if you want longer retention, or to submit data at intervals of less than 10 minutes between events, you will need a paid account.
To celebrate our new addition we would like to give you a discount.

If you are getting a yearly subscription, use hubitat12 as your code and you get almost 50% off. The number of discounts is limited, though.

The installation is the usual process: copy the plugin code from raw.githubusercontent.com/XLazz/ipstas-hubitat-hundredgraphs/master/smartapps/ipstas/hundredgraphs-logger.src/hundredgraphs-logger.groovy into the Apps Code (developer) section.
Create a user app.
Then you will need your own secret API key from your HundredGraphs profile.
And then select what you would like to report.

It is still considered beta, but the code was ported from SmartThings, where it works beautifully.

If you would like to get a discussion going, please use community.hubitat.com/t/release-hundredgraphs-create-graphs-and-alerts

You will find direct links here: https://community.hubitat.com/t/release-hundredgraphs-create-graphs-and-alerts/

1 Like

I'm waiting for my Pi to be delivered :slight_smile: to attempt to map energy usage against my heat pump. I've read about flash drive damage with lots of writes - should I be worried?
I have a Netgear NAS with 10 TB of spare disk space; does anyone know if I can install InfluxDB on it?

As long as you can afford for the flash drive to fail (there's a good chance it won't), both in terms of the cost of the SD card and the potential loss of data, I would suggest setting up the new rPi using an SD card, and in the meantime looking at your options for using your NAS or another storage option. I did almost exactly the same thing last year (InfluxDB + Grafana on an rPi4 + SD card) and have only recently transitioned mine to an external SSD.

I guess what I am saying is there is no real rush; play around with it using the SD card, and you can make the transition later if you wish.

2 Likes

First dashboard up and running, very pleased. :slight_smile:

3 Likes

My SD card lasted about 6 months. When it failed I moved everything over to an always-on Mac.

It can also depend on the brand/type of SD card; some are geared towards more writes than others, at least I think that is true. Either way, @UKMedia, I would suggest looking into moving to a more appropriate storage medium sometime soon.

The SD card won't be wasted; when you buy your second or third rPi as a development setup, you could use it for one of those :wink:

1 Like

Are there any 'simple' instructions to do this and point it at a Netgear NAS? The NAS is wired to the same network. I've tried changing the influxdb.conf file to point to the network addresses of the NAS mirrored directories for data, meta and wal, but the InfluxDB service failed to start.

Please note: I'm two days into the Raspberry Pi, the command line, InfluxDB, etc.!

I'm thinking of a symbolic link, but I don't know how to set one up.

All sorted now, all up and running on my NAS. :slight_smile:

1 Like

Hey guys, is anybody here running InfluxDB 2.x with the InfluxDB Logger Hubitat app? I'm still new to Hubitat and can't seem to get the InfluxDB Docker server on Unraid to receive any info from Hubitat using the logger app. It keeps coming back with an unauthorized error in the Hubitat log. Thanks.

Nope, I'm using the shell version 1.8.10.

Okay, that makes sense why 2.x doesn't seem to be working with Hubitat, because most people are still using the earlier versions. Well, since I don't know how to code, I'll wait till someone who does wants to upgrade to 2.x and provides a fix for the InfluxDB Logger for Hubitat.

I am using MakerAPI > Node-RED > InfluxDB 2.0.
The only challenge here is that you have to filter the events in Node-RED. There was one I found posted on this forum, but I didn't really like it, so I rewrote most of it. Once I set up a filter to only allow the stuff I was interested in through to Influx, all I have to do is add the devices in the MakerAPI instance and it works. I am assuming the logging app has some sort of filtering as part of the app?
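
The filter itself is just a Node-RED function node, but the idea is simple enough to sketch in Python: an allow-list of device/attribute pairs, and anything else gets dropped before it reaches Influx. This is a rough illustration rather than my actual flow - the URL, token, org, bucket and allow-list are placeholders - and note that InfluxDB 2.x only accepts API writes with a valid token, which is usually what an "unauthorized" error is complaining about.

# Rough sketch of the filter-then-write idea (placeholders throughout).
from influxdb_client import InfluxDBClient, Point          # pip install influxdb-client
from influxdb_client.client.write_api import SYNCHRONOUS

ALLOWED = {                                # device name -> attributes worth keeping
    "Office Sensor": {"temperature", "humidity"},
    "Main Power": {"power", "energy"},
}

def keep(event: dict) -> bool:
    """Drop any event that isn't on the allow-list."""
    return event.get("name") in ALLOWED.get(event.get("displayName", ""), set())

client = InfluxDBClient(url="http://unraid:8086", token="my-api-token", org="home")
write_api = client.write_api(write_options=SYNCHRONOUS)

def handle(event: dict) -> None:
    if not keep(event):
        return
    point = (
        Point("hubitat")
        .tag("device", event["displayName"])
        .field(event["name"], float(event["value"]))
    )
    write_api.write(bucket="hubitat", record=point)

# A simplified example event (Maker API's actual payload has more fields):
handle({"displayName": "Office Sensor", "name": "temperature", "value": "21.5"})

In Node-RED the same logic just lives in a function node in front of whatever node writes to Influx.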

In the spirit of keeping things "local," has anyone experimented with writing data to a Synology Office spreadsheet?

Instead of writing to a Google Sheet, I'd like to try to write to my own NAS's office app. Didn't find any mentions of Synology Office while poking around.

Has anyone successfully used REST to read data into a logger?

I'm trying to connect PRTG (REST Sensor v2) to the Hubitat and read temperature/moisture data from a sensor, but I'm not sure of the correct syntax for extracting values from the Hubitat device JSON page (Hubitat Maker API).
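
Roughly what I'm after, as a Python sketch - the hub IP, app ID, device ID and access token below are placeholders, and I'm assuming the device JSON has the usual attributes array of name/currentValue pairs:

# Rough sketch: read one attribute from a Hubitat Maker API device endpoint.
# Hub IP, app ID, device ID and access token are placeholders.
import requests

HUB = "http://192.168.1.10"
APP_ID = "12"                 # Maker API app instance ID
DEVICE_ID = "34"              # the sensor's device ID
TOKEN = "my-access-token"

url = f"{HUB}/apps/api/{APP_ID}/devices/{DEVICE_ID}?access_token={TOKEN}"
device = requests.get(url, timeout=10).json()

# The device JSON (typically) carries an "attributes" list of
# {"name": ..., "currentValue": ...} entries.
values = {a["name"]: a["currentValue"] for a in device["attributes"]}
print(values.get("temperature"), values.get("humidity"))

That lookup (attributes -> currentValue) is what I need to express on the PRTG side.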

Alternatively, I'm looking to send data to InfluxDB, but to a cloud instance - has anyone configured the InfluxDB app to connect to a cloud instance?

Do you have any links for the devices / platforms you are referring to?

And Welcome to the Community :slight_smile:

I have regular Z-Wave and Zigbee devices connected to Hubitat. I want to store this data in PRTG, which we are using for network monitoring (PRTG Network Monitor - All-In-One Network Monitoring Software). They have a standard REST sensor that I can link to Hubitat; I just need to know the syntax of the output of the Hubitat Maker API for JSON calls (Maker API - Hubitat Documentation).

Grafana Cloud | Grafana Labs is the host for the cloud-based historian.