Wattage question

Hi guys, just a query which may spark some conversation. I can't seem to find much information on the standby wattage of most smart devices. So basically I'm wondering how many watts a typical smart bulb or outlet uses when not in use.

My real question, not to be rude or contentious, is: do smart bulbs save money? I.e. having a motion rule which turns the bulb off when I leave the room is useful, but if it saves me 10 watts per day while my daily standby mode cost is 20 watts, then it's counterproductive with regard to saving the planet.

That said, I understand all of the other advantages such as convenience etc etc etc. Thoughts welcomed guys :+1:


The standby draw of smart devices is nearly immeasurable. Maybe if you added up hundreds of them you would get a few watts, but if these devices consumed 20 W like your example, you would feel the warmth when you touch them. And things like light switches and turned-off smart bulbs are not even a fraction of a degree warmer than their environment.

But as to whether they save money, probably not. Smart devices cost a lot more than a standard LED bulb or light switch, and that premium pretty much can't be overcome by the slight cost savings of turning off an LED when not in use. A 60W-equivalent LED (drawing roughly 9 W) only costs about $10 per year even if run 24/7.
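For anyone who wants to sanity-check that kind of figure, here's the arithmetic as a quick sketch. The 9 W draw for a 60W-equivalent LED and the 12¢/kWh rate are assumed ballpark values, not measurements:

```python
# Annual cost of running one bulb 24/7.
# Assumptions: a "60W-equivalent" LED draws about 9 W of actual power,
# and electricity costs roughly $0.12/kWh (approximate US average).
WATTS = 9.0
RATE_PER_KWH = 0.12
HOURS_PER_YEAR = 24 * 365

kwh_per_year = WATTS * HOURS_PER_YEAR / 1000.0
annual_cost = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.1f} kWh/year -> ${annual_cost:.2f}/year")
```

That lands right around the ~$10/year mark mentioned above; swap in your own wattage and rate to match your situation.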

However, as a nation or country, if everyone conserved and shut off unneeded lights, it would be a significant number.


I measured the draw of one of my Hue bulbs with a Kill-A-Watt once when the bulb was off, and it was around 0.5 W. I'm not sure how accurate these meters are at wattages that low, but it's still probably pretty close (and my casual Googling suggests this is about right; I see 0.4 W or even less with a quick search now, though this likely depends on the exact model, and other brands are sure to differ a bit too). This is a fraction of what they use when on, and I suspect that my smart home automations keep them turned off for longer than if I had to remember to do so manually. If you use a smart switch/dimmer with "dumb" bulbs, this will likely work out even better in your favor, since you have only one smart device drawing standby power instead of multiple smart bulbs doing the same thing (assuming you have more than one bulb controlled by that switch and that the standby power is similar).

My estimate for my own circumstances is that it's still probably better in the end, or at least close enough for me not to worry, and I enjoy the convenience of motion lighting and the fact that I rarely need to touch a button/switch. I also use home automation for more than lights, even if that is probably my favorite use, so I'd have the hub anyway. I suppose you could crunch some numbers using your real device consumption and an estimate of how many hours on vs. off if you cared to make a data-driven decision for your own circumstances. :slight_smile:
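A minimal sketch of that number-crunching. Every input below is an illustrative assumption (a 9 W bulb, 0.5 W standby, 2 extra off-hours per day bought by the automation, 12¢/kWh); you'd replace them with your own measurements:

```python
# Does the automation's extra "off" time outweigh the standby draw?
# All values are illustrative assumptions, not measurements.
ON_WATTS = 9.0         # bulb power when lit
STANDBY_WATTS = 0.5    # smart bulb draw while "off"
EXTRA_OFF_HOURS = 2.0  # hours/day the automation keeps it off vs. manual habit
RATE = 0.12            # electricity price in $/kWh

saved_wh_per_day = ON_WATTS * EXTRA_OFF_HOURS   # energy the automation saves
standby_wh_per_day = STANDBY_WATTS * 24         # energy standby mode costs
net_wh = saved_wh_per_day - standby_wh_per_day  # negative means a net loss
annual_dollars = net_wh * 365 / 1000 * RATE
print(f"Net {net_wh:.1f} Wh/day saved, about ${annual_dollars:.2f}/year")
```

With these particular numbers the automation comes out slightly ahead; if the bulb barely gets used, the 24-hour standby term can win instead.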


It will vary depending on the device. For example, a Hue E26 (A19) 1100-lumen lamp has a maximum standby power draw of 0.2 W. A LIFX A19 1100-lumen lamp lists a maximum standby power of less than 0.5 W. The point is, it isn’t much.


I think LED lights have severely reduced any significant savings to be had. Personally, I don't think of automation as a cost-saving method, purely convenience and comfort. I live in the USA, which may make a difference. I know the EU is much more energy conscious, and Sweden and Norway even more so.

I just purchased a new refrigerator, and I'll guess the efficiency of this newer appliance will far outweigh any automation savings I may have received.


OK, I will be the pedantic one. The watt is a unit of instantaneous power; the watt-hour is the common metric of consumed energy.

Here is an article from Sengled about this exact topic, in which they estimate $0.35 of annual standby operating costs.
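Working that estimate backwards into a standby wattage, as a sketch (the 12¢/kWh rate is an assumption; the resulting ~0.33 W is just what falls out of the arithmetic, not a published spec):

```python
# What continuous standby wattage corresponds to $0.35/year?
ANNUAL_COST = 0.35  # Sengled's annual standby cost estimate
RATE = 0.12         # assumed electricity price in $/kWh

kwh_per_year = ANNUAL_COST / RATE          # energy you can buy for $0.35
watts = kwh_per_year * 1000 / (24 * 365)   # spread over a full year
print(f"{kwh_per_year:.2f} kWh/year -> {watts:.2f} W standby")
```

That back-calculated figure is consistent with the 0.2-0.5 W standby numbers quoted elsewhere in this thread.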

I appreciate conservation, but also perspective. Bulbs are an easy target for folks, because we see them working and we were all told as kids to turn them off. They are a rounding error in my energy consumption (105F weather with central AC and pool pump - yes, both are automated). A single, common industrial motor dwarfs the consumption in my entire housing plan.


Totally agree. My wife goes nuts about my kids leaving the lights on, but cranks our electric baseboard heaters up to Hawaii temperatures without a second thought :man_shrugging:t2:


I think that's a universal law of life.

It is always easy to rationalize your own behavior while refusing to excuse others'.

"Yeah, but..."
"At least it isn't as bad as XYZ!"
"This is different!"

All common rationalization phrases.


I come from Europe and I am old enough to have witnessed the two oil crises of the 1970s (yes, walking on the highways ...). So I have energy saving in my blood :wink:

That said, we're generally misled by marketing. Yes, LED lamps are far more economical than filament lamps. But most of our energy is consumed by heating and related uses (like water heating). The energy consumed to heat water for one month is in a different league from the energy consumed by LED lamps.

So, if you really want to lower your energy consumption, it's better to optimize or replace your heating system. And speaking of which, it's better to insulate your home BEFORE adding or changing your heating unit.

That's because keeping energy inside your house costs less than producing energy. Good basement/attic insulation will save more money than any LED lamp! In the long run, at least. I agree that purchasing an LED lamp costs less than a geothermal unit!

But very small things can also be done to save money, like insulating the outlets (yep, wind will travel through the conduits and holes), sealing the small gap under your main door, and replacing your old garage door (calculate its surface area and the rate of heat loss) with a better-insulated one.
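As a rough illustration of why the garage door matters, here's a steady-state heat-loss sketch using Q = U × A × ΔT. All the U-values and dimensions below are assumed ballpark figures, not measurements of any real door:

```python
# Steady-state conduction loss through a garage door: Q = U * A * dT.
# Illustrative assumptions: typical ballpark U-values and a mid-size door.
AREA_M2 = 7.0   # roughly a 16 ft x 7 ft door
DELTA_T = 25.0  # indoor/outdoor temperature difference in K
U_OLD = 3.0     # W/(m^2*K), uninsulated single-skin door (assumed)
U_NEW = 1.0     # W/(m^2*K), insulated replacement door (assumed)

loss_old = U_OLD * AREA_M2 * DELTA_T
loss_new = U_NEW * AREA_M2 * DELTA_T
print(f"Old door: {loss_old:.0f} W, insulated: {loss_new:.0f} W, "
      f"saved: {loss_old - loss_new:.0f} W")
```

Even with these rough numbers, the continuous savings dwarf the fraction of a watt a smart bulb draws in standby.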

Update: Here in Quebec, the electricity company promotes Energy Star appliances (fridges, washing machines, TVs). At first it seems a strange idea for an electricity company to encourage people to consume less energy. I realized later that if people locally consume less energy, the company can sell the surplus (at higher prices) to the USA and lower its own costs by not running an additional unit (even here, where it's mostly hydroelectric dams).

And, by the way, you'll probably replace your LEDs before they burn out, just because there's new technology around the corner... And you probably have a box somewhere in your garage full of (functional) CFL and first-generation LED lamps...


I once argued with my boss's wife about economy. She kept insisting, "You can't leave the [LED] lamps on all day... You should see my electrical bill!"
I work in IT, and we have a lot of computers, servers, and AC units running 24/7. One day, I had yet another argument with her about this "electricity bill". I explained to her, numbers in hand, what our computers consume versus those tiny 30 W-equivalent lamps (about 5 W electrical)...
After that, she never argued with me about electricity bills :rofl:

I’m glad this question was asked by the original poster. I look forward to the answers.

A related question: How much does it cost to run the Hubitat hub, the Hue hub and the Lutron hub?

I’ve been wondering for a while how much money my home automation hobby is really costing me.

I always cringe when I hear people say they are into home automation to save money by making sure the lights get turned off. For the average home with LED lighting, you are going to spend more money on the hardware than you'll ever recoup in electricity savings.

When gas prices rise, there are people who justify buying a motorcycle to save money. There is a natural need to present a rationalization, instead of just saying, "I wanted to" or "because it's fascinating to me". Marketing folks are well aware of the gap between our left brain and right brain.


Completely off topic, but that doesn't actually work in most areas, especially the oil industry. As people use fewer petroleum products, the manufacturers simply manufacture less and keep less inventory. So you are still just as likely to have a crisis as you were before conserving.

Maybe not you personally, but society overall.

I'm not saying conservation isn't good, it is. But it does not fix issues like supply crises... At all.

Weird that I haven't noticed a topic like this before; I'm sure it has come up, but great topic @stueyhughes. No need to be so defensive :slight_smile: Now I just need to read everyone else's responses....

Very little. At 5 V and 1 A, that is at most 5 W continuous. It is very likely much less than that, but let's say 5 W.

Using this calculator, plugging in 5 W running 24/7, and assuming an average USA cost (mine is a bit higher) of 12¢ per kWh, you get about $5 per year. In reality the hub isn't going full tilt all the time, so you're really closer to $1-$2 per year, maybe even less than that.
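The same arithmetic without the online calculator, for anyone who wants to plug in their own rate (5 W continuous is the worst-case assumption above, and 12¢/kWh is the assumed average US price):

```python
# Worst-case annual cost of a 5 V / 1 A hub running flat out 24/7.
WATTS = 5.0  # absolute maximum from the power supply rating
RATE = 0.12  # assumed average US electricity price, $/kWh

kwh_per_year = WATTS * 24 * 365 / 1000
annual_cost = kwh_per_year * RATE
print(f"{kwh_per_year:.1f} kWh/year -> ${annual_cost:.2f}/year worst case")
```

Since the hub never actually draws the full 1 A continuously, the real figure is some fraction of this.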


I measured the power (for UPS design).

The 5 V rail runs at about 0.3 A normally, then periodically jumps to 0.6 A, so assume an average current of 0.45 A.

Power at the hub = 5 V × 0.45 A = 2.25 W, and if the AC adapter is 90% efficient, then the power from the wall = 2.5 W.

So using @neonturbo 's calculation but cutting the wattage in half, the cost is easily less than $1.
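Putting those measurements into code as a sketch (the currents are the measured values above; the 90% adapter efficiency and the 12¢/kWh rate are assumptions):

```python
# From measured hub current to annual wall-socket cost.
# Currents are measured values; efficiency and rate are assumptions.
VOLTS = 5.0
AVG_AMPS = 0.45           # average of ~0.3 A idle with jumps to 0.6 A
ADAPTER_EFFICIENCY = 0.90 # assumed AC adapter efficiency
RATE = 0.12               # assumed electricity price, $/kWh

hub_watts = VOLTS * AVG_AMPS
wall_watts = hub_watts / ADAPTER_EFFICIENCY  # adapter losses show up at the wall
annual_cost = wall_watts * 24 * 365 / 1000 * RATE
print(f"{hub_watts:.2f} W at hub, {wall_watts:.2f} W at wall -> "
      f"${annual_cost:.2f}/year")
```

Either way you slice it, the hub costs a couple of dollars a year at most.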

Now an additional thought..... How many AC adapters are in your home? And how much electricity is being used by those adapters even if they are productive (i.e. charging a phone).

I started to count and gave up...

Just the "wall warts"

  • Cable TV adapter
  • Cordless phones
  • Computer speakers
  • Router
  • Hubitat
  • Distance sensor in my garage
  • Desk lamp
  • Wine opener
  • then I stopped counting....

OMG, I'm on 18.22 pence per kWh in the UK since our latest wholesale energy price increase. I work that out to be roughly 25 cents per kWh. Ouch👎

Mine is $0.09/kWh. :slight_smile:

That's why solar projects don't pay out in a reasonable time in Texas - even with the incentives.

Most people who do solar here either think they will keep the house forever (so it will pay out... eventually), or do it because it "just feels like a good thing to do" (Austin), since a 13-15 year break-even doesn't work for most people.

And it is really only a 13-15 year break-even if you ignore the money you could have made investing the capital elsewhere... I've run the numbers a dozen times. Solar proponents always tell me I'm wrong on the break-even timeframe, until they go through the numbers with me... :wink:
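A toy model of that argument (every number here is an illustrative assumption, not a quote from any real project): simple payback just divides cost by annual savings, while the opportunity-cost version asks when your reinvested savings catch up to the capital growing at a market rate.

```python
# Simple payback vs. payback accounting for opportunity cost.
# All inputs are illustrative assumptions.
SYSTEM_COST = 15000.0    # installed solar cost after incentives
ANNUAL_SAVINGS = 1100.0  # electricity bill reduction per year
MARKET_RETURN = 0.06     # assumed return the capital could earn elsewhere

# Naive payback ignores the time value of money:
naive_years = SYSTEM_COST / ANNUAL_SAVINGS

# With opportunity cost: find the year when cumulative reinvested
# savings catch up to the capital compounding at MARKET_RETURN.
balance = SYSTEM_COST
saved = 0.0
year = 0
while saved < balance and year < 60:
    year += 1
    balance *= 1 + MARKET_RETURN                 # capital left invested
    saved = saved * (1 + MARKET_RETURN) + ANNUAL_SAVINGS  # reinvested savings
print(f"Naive payback: {naive_years:.1f} years; "
      f"with opportunity cost: {year} years")
```

With these made-up but plausible numbers, the naive ~13.6-year payback stretches to roughly 30 years once the foregone investment return is counted, which is the gap being described above.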


Hmmm… current pricing for today in my area of Sweden works out to about $0.27, and it was even higher yesterday!!

All in with fees, distribution and generation charges, I’m in the range of $0.24 per kWh up in the northeast area of the US. The joys of high cost of living states.

Solar at least probably has a slightly quicker break even period, although I’ve never done the math.
