Naive question - a long time ago I remember Sun pushing JavaOS for embedded systems and, possibly, thin clients. But I don't recall anything coming of it. Are JVMs inherently resource hungry?
To be clear, this is not a question about current or future HE hardware. I'm just trying to educate myself a little.
I have always been a fan of "throwing hardware at a problem". With the declining cost of memory, CPU, etc., it only makes sense to do that. (Compare the RPi 4 to the RPi 3.)
It seems like such a "no-brainer"; why wouldn't they do that?
Disclaimer: I've only used MQTT as a hobbyist, and never with any load testing.
My understanding of MQTT is that it's designed to be very, very lightweight, and this is also true for the broker. Yes, the broker receives and passes on thousands of messages per second, but it's not really doing anything with them beyond passing them on to subscribers.
When HE receives an MQTT message, by contrast, it has to parse it, and the message then ripples out into all the various updates (including database updates) that result from it.
It's similar to how my mailman delivers thousands of letters each day, but I couldn't read and reply to more than a handful of them.
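To make that asymmetry concrete, here's a minimal sketch of the consumer side using the Eclipse Paho Java client. The broker URL, topic filter, and the `parseAndUpdate` call are made-up examples, not anything from HE's actual code; the point is just that the broker's job ends once it hands over the bytes, while everything inside the callback is per-message work the consumer pays for.

```java
// A minimal sketch of the consumer side, using the Eclipse Paho Java client.
// The broker URL, topic filter, and parseAndUpdate() are hypothetical.
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttException;

public class MqttConsumerDemo {
    public static void main(String[] args) throws MqttException {
        MqttClient client = new MqttClient("tcp://localhost:1883",
                MqttClient.generateClientId());
        client.connect();
        // The broker only routes the raw bytes to this callback;
        // everything below is work the *consumer* has to do per message.
        client.subscribe("home/+/state", (topic, message) -> {
            String payload = new String(message.getPayload());
            // parseAndUpdate(topic, payload);  // hypothetical heavy work:
            //                                  // parse, update device state,
            //                                  // fire rules, write to the db
            System.out.println(topic + " -> " + payload);
        });
    }
}
```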
LOL all that said... I also am a fan of horsepower, and welcome all I can get my hands on.
It doesn't always work as expected. Some time ago I worked on a slot machine that had an occasional stutter in the reel spin. Eventually the decision was made that memory was too constrained, so more RAM was added to the system.
This made the problem worse, not better. It turned out that the stutter was caused by the JVM's garbage collection: with more memory, more garbage could accumulate between collections, so each collection took longer.
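That "more RAM made it worse" effect is easy to reproduce. Below is a small sketch of my own (it has nothing to do with the actual slot machine code) that churns through roughly 2 GB of short-lived allocations and then reports what the collectors did; total time divided by count gives a rough average pause per collection.

```java
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.util.ArrayList;
import java.util.List;

public class GcPauseDemo {
    public static void main(String[] args) {
        List<byte[]> live = new ArrayList<>();
        long allocated = 0;
        for (int i = 0; i < 2_000_000; i++) {
            byte[] garbage = new byte[1024];   // short-lived garbage
            allocated += garbage.length;
            if (i % 20 == 0) {
                live.add(new byte[1024]);      // some longer-lived data too
            }
            if (live.size() > 10_000) {        // keep the live set bounded
                live.subList(0, 5_000).clear();
            }
        }
        System.out.println("Allocated ~" + allocated / (1024 * 1024) + " MB of garbage");
        for (GarbageCollectorMXBean gc : ManagementFactory.getGarbageCollectorMXBeans()) {
            System.out.printf("%s: %d collections, %d ms total%n",
                    gc.getName(), gc.getCollectionCount(), gc.getCollectionTime());
        }
    }
}
```

Try running it with `java -Xmx32m GcPauseDemo` and again with `java -Xmx2g GcPauseDemo`: with the bigger heap you typically see far fewer collections, but each one has more garbage to cover. How stark the difference is depends on the JVM and collector; older stop-the-world collectors show exactly the kind of stutter described above much more than modern concurrent ones.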
I hope there are more CPU and RAM resources on the C-7. I replaced a Vera Plus, which, despite everything bad about it, responded very, very quickly when an automation was triggered. The C-5 isn't bad, but ~1-2 s is a lot more noticeable than almost instantaneous. My doorbell button rule takes a couple of seconds to respond, so people press it a bunch of times.
My boss on the slot machine project wanted the developers' boxes to be close in power to the slot computer; that way, when they were running locally, they would get a better feel for the performance of the product. I specced new computers for everybody, but it was rejected by corporate. It seems we had a "standard" build for all engineers: instead of a $2,000 system, we were forced to buy a $5,500 system. So much for trying to save some money.
Back in the late 90s we upgraded our web server to two Sun E250s and one E450. Each programmer had a Sun Ray thin client connected to the big servers. It actually worked quite well and was quite a bit cheaper than having a dedicated desktop.
Ha! I wrote most of my doctoral thesis on a NeXT Cube, and it was printed on a NeXT laser printer. Here's the kicker: when I started my graduate work, I shared a close-to-original PC with several others. I remember "upgrading" it to 512K of RAM.
Edit - I remember being really excited when EGA monitors became available. Huge upgrade from MGA or CGA (wealthy snobs!). Compared to all that, Display PostScript on the NeXT was just amazing.
Hahaha, you are the first person I have ever encountered who worked on a NeXT. Steve Jobs in a classic moment.
Well, my first PC was a 286/12 with a whopping 1 MB of RAM. A huge upgrade from my Commodore. I squeezed all the RAM I could out of the first 640K using QRAM. And yes, I had a Hercules card that gave me 16 shades of orange!!
I did my thesis (degree only - no doctorate here) on an 8088 with two 5.25" floppies. We had 20 Sun machines in one lab ... rewriting pieces of the scheduler for lab assignments.
Sometimes, I miss those days. Life was simpler ... also, I could fix my own car.
P.S. Maybe the most comforting thing about Hubitat (outside of it being a great automation platform) is that I can relate age- and experience-wise to a whole bunch of the fantastic folk on this community board.