Bump - Hoping someone familiar with this can help?
Did you get this working? If you don't mind me asking, what is the reason for this over just using something like InfluxDB or webCoRE to collect the data and generate graphs? Those other options are fairly well vetted at this point.
No, I've not gotten it working.
The reason is to get long-term data onto a Google Sheet to keep a log (for example, I'd like to keep the max daily relative humidity at the ridge of my encapsulated attic over many years), and to visualize it. HE generates all the data but just does not have a good way to visualize and reference it.
My understanding is InfluxDB requires another device that is constantly on, which I don't have, and I have no idea about webCoRE.
Are you saying that if I install webCoRE on Hubitat there is a way to get it to send data to Google Sheets?
Yeah, you are not the first to bring this up, especially for keeping data over a long time.
As far as InfluxDB goes, the answer is a little complicated. You can use InfluxDB and Grafana entirely from the cloud, but the cloud tier has limits, and one of them is the retention period. So in your case InfluxDB would have to run locally on an always-on device. The nice thing is that, though the device is always on, it can be something like a Raspberry Pi 3 or 4 and doesn't even need a ton of memory. Once you run it locally, you set the retention period (or never set one) and keep the data forever; the limit is storage at that point. I would get a Raspberry Pi 5 and an NVMe HAT. You won't need a huge NVMe drive for it. Then also load Grafana on the Raspberry Pi, and you are all set with a good database for time-series data and a great visualization tool.
The nice thing about webCoRE is that it integrated the functionality of an app called Hubigraphs after the original developer left the platform. That tool keeps your event data locally and allows for visualizations entirely on the hub. I can't advise much with webCoRE as I don't use it, but I know a lot of folks use it and like it.
The big problem with long-term storage is that a lot of the time the data you generate on a daily basis isn't what you want to keep for long-term reports; you want a derivative of it. This means at some point you want to do some kind of data reduction. That just means taking the mass of data produced throughout the day and analyzing it to keep only what you need for those long-term data points. A good way to think of this relates to your example. All of the tools for this kind of thing trigger on the sensor value changing, so you would likely have a full day of relative-humidity values from that sensor. To reduce that, you would have a query that analyzes all of that data and then stores only the max value for each day in a separate table. That is just something to consider. You may find the environment is so small it doesn't matter to keep data forever.
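The reduction step described above can be sketched in a few lines. This is just an illustration, not any particular tool's query language: it assumes the raw events arrive as (ISO timestamp, value) pairs and keeps only the maximum reading per day.

```python
from collections import defaultdict
from datetime import datetime

def daily_max(events):
    """Reduce (ISO timestamp, value) events to one max value per day."""
    maxima = defaultdict(lambda: float("-inf"))
    for ts, value in events:
        day = datetime.fromisoformat(ts).date()
        maxima[day] = max(maxima[day], value)
    return dict(maxima)

# Example: a full day of humidity readings collapses to one point per day.
events = [
    ("2024-06-01T06:00:00", 58.2),
    ("2024-06-01T14:00:00", 71.5),  # daily max for June 1
    ("2024-06-01T22:00:00", 63.0),
    ("2024-06-02T13:30:00", 69.9),  # daily max for June 2
]
print(daily_max(events))
```

The long-term table then holds one row per day instead of hundreds, which is what keeps multi-year storage small.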
Thank you. Very helpful explanation.
Unfortunately, I don't have a Raspberry Pi or any other always-on device, and one of the reasons I got a Hubitat was so I would not need to mess with setting up a device that is not plug & play (as required for HA). So I'm probably not going there.
I use several Hubigraph visualizations. Even though it is no longer supported, it still works. However, it is not user friendly, and seeing the data is not straightforward. I also tried webCoRE once for the supported and perhaps further-developed Hubigraph charts, but could not figure it out.
If I can just figure out how to get my data to export to a google sheet automatically, I'll be all set. There are apparently several users who have done this. I hope one of them drops by to help...
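For what it's worth, the usual pattern in those threads is a Google Apps Script published as a web app, with the hub calling the web app's URL and passing the reading as query parameters so the script can append a row to the sheet. The URL and parameter names below are hypothetical placeholders (your own come from the script's deployment); this sketch only shows how such a logging URL would be built.

```python
from urllib.parse import urlencode

# Hypothetical Apps Script web-app URL (a real one comes from "Deploy > Web app").
SCRIPT_URL = "https://script.google.com/macros/s/EXAMPLE_DEPLOYMENT_ID/exec"

def build_log_url(device, attribute, value):
    """Build the GET URL that would ask the Apps Script to append one row."""
    params = urlencode({"device": device, "attribute": attribute, "value": value})
    return f"{SCRIPT_URL}?{params}"

print(build_log_url("Attic Sensor", "humidity", 71.5))
```

The hub then only needs to issue a plain HTTP GET on that URL (e.g. from a rule that fires on the sensor event); the script on Google's side does the actual writing to the sheet.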
Understood. If you ever change your mind and want to give it a try, feel free to hit me up for help. I have a tutorial on setting it all up, and even a Raspberry Pi image with InfluxDB, Grafana, and other tools to help someone get started.
I hope you get this figured out.
Only when I did it using the ChatGPT method.
I would just go back, put the scripts in again, and authorize them. Google is just checking to make sure you know you are saving a new script that it can't verify itself. It is pretty straightforward about what it is doing if you look at the Google Developer pages.
I wrote the delete script, but I only modified the original add script a bit, the one that I got from the SmartThings forum post.