Proposal: Updated and improved battery reporting

This proposal is for a standardized, user-adjustable battery percentage reporting algorithm, to be used with all battery-powered devices supported by HE. For existing drivers, the updated logic would be added as time permits. Consider this a starting point; feel free to improve it.

As many of us know, battery percentage is perhaps the most arbitrary and unreliable piece of device information, yet almost all of us rely upon it to keep our battery-powered devices up and running. For example: I have a (non-critical) contact sensor that has been working at 0% for the past 3 months, and conversely many of us, including me, have had devices fail at 25% and higher.


  • Display the device's reported event voltage in the device's "Current States" as battery_volts, and in the event message.
    battery_volts: 5.6
    "Iris V3 Keypad battery was 71% 5.6 volts"

  • User-adjustable Max and Min voltage fields allow for devices that don't conform to predefined norms, standards, or manufacturer specifications.

  • Battery type selection: Alkaline, Lithium, Rechargeable, SilverOxide. When changed, the Max and Min voltage values are updated accordingly.

Example of a standardized percentage formula, taken from a keypad driver supporting multiple device types reporting either 3 or 6 volts:

	results = createEvent(getBatteryResult(Integer.parseInt(descMap.value, 16)))

  private getBatteryResult(rawValue) {
	def linkText = getLinkText(device)
	def result = [name: 'battery']
	def volts = rawValue / 10
	def batteries = 2
	if (device.getDataValue("model")?.substring(0, 3) != '340')	// UEI and Iris V3 use 4 AA batteries, 6 volts
		batteries = 4
	def excessVolts
	def maxVolts
	def minVolts
	switch (BatteryType) {
		case "Rechargeable":
			excessVolts = (1.35 * batteries)
			maxVolts = (1.2 * batteries)
			minVolts = (1.0 * batteries)
			break
		case "Lithium":
			excessVolts = (1.8 * batteries)
			maxVolts = (1.7 * batteries)
			minVolts = (1.1 * batteries)
			break
		default:					// assumes alkaline
			excessVolts = (1.75 * batteries)
			maxVolts = (1.5 * batteries)
			minVolts = (1.25 * batteries)
	}
	if (volts > excessVolts) {
		result.descriptionText = "${linkText} battery voltage: $volts, exceeds max voltage: $excessVolts"
		result.value = Math.round((volts * 100) / maxVolts)
	} else {
		def pct = (volts - minVolts) / (maxVolts - minVolts)
		result.value = Math.min(100, Math.round(pct * 100))
		result.descriptionText = "${linkText} battery was ${result.value}% $volts volts"
	}
	return result
  }
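For readers who want to follow the math without the driver plumbing, here is the same voltage-to-percentage mapping as a minimal Python sketch. The per-cell preset voltages mirror the Groovy example above; in the actual proposal they would be overridden by the user-adjustable Max/Min fields.

```python
# Sketch of the proposed percentage calculation (Python for illustration only).
# Per-cell presets match the Groovy example; real drivers would let the user
# override min/max via preference fields.
PRESETS = {
    "Alkaline":     {"excess": 1.75, "max": 1.5, "min": 1.25},
    "Lithium":      {"excess": 1.8,  "max": 1.7, "min": 1.1},
    "Rechargeable": {"excess": 1.35, "max": 1.2, "min": 1.0},
}

def battery_percent(volts, battery_type="Alkaline", cells=2):
    p = PRESETS.get(battery_type, PRESETS["Alkaline"])
    min_v = p["min"] * cells
    max_v = p["max"] * cells
    if volts > p["excess"] * cells:
        # Over-voltage case: report against max rather than the normal span
        return round(volts * 100 / max_v)
    pct = (volts - min_v) / (max_v - min_v)
    return max(0, min(100, round(pct * 100)))
```

For a 2-cell alkaline device this maps 2.5 V to 0% and 3.0 V to 100%, linearly in between.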

You've proposed a very laudable goal, and I'd certainly like it to be that easy.

However, I believe the majority of the issue is in the device firmware itself. A number of my battery devices provide "raw" voltage, and the driver allows me to set the min and max voltages as numbers (e.g. min = 2.4 and max = 2.7, or whatever I choose). I still get devices that report extremely low voltage yet are still working normally.

I expect one day these battery life predictions will become better, but for now I simply use "Device Activity Check" to let me know if anything doesn't report at least once a day. And I have no critical devices for which being gone for 1 day would be an issue.



Without knowing the reported battery voltage, and in some cases the percentage formula, there is no way to effectively set the minimum voltage to generate a meaningful percentage.

If that is a user driver it should be easy to find the code that calculates the percentage.


I do have access to the code for one device. The calculation results in the min voltage = 0% and the max voltage = 100%.

  • I modified the code to output raw volts.
  • I removed the battery and measured its voltage with an accurate voltmeter.

The battery measured with a bench meter = 2.999 volts.
At the same time, the device reported the voltage as 3.104 volts.

So with a working voltage span of only about 1/2 volt, the measurement error is about 20%.
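The ~20% figure follows directly from the numbers above: the device over-read by about 0.105 V, against a usable span of roughly 0.5 V.

```python
# Measurement-error estimate from the figures quoted above.
bench = 2.999      # bench voltmeter reading (volts)
device = 3.104     # voltage reported by the device
span = 0.5         # approximate working voltage span (volts)

error = device - bench            # ~0.105 V absolute error
error_pct = error / span * 100    # error as a share of the usable span
print(round(error_pct))           # prints 21
```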

In addition, it is likely the 2.999 volts I measured was higher than what the battery might be supplying when the sensor wakes up to take the measurement and send the data.

On top of all this, I would guess sensors vary as to what their actual minimum operating voltage is. I'll also bet the min voltage (and likely the measurement) vary with temperature.
All this ignores the battery to battery variation.


In the real world it's the voltage reported by the device that matters. It will likely vary from what is reported by a voltage meter, and will be impacted by temperature.

When a device stops working and the actual battery voltage is known, then it can be set as minVolts, most likely making subsequent percentage calculations more accurate.

This proposal is designed to give the user some control, provide more information, and better percentage calculations. However, the percentage calculation will never be 100% accurate.

Nor reliable enough for dependable battery alerting. 'Tis a noble objective, but it will fall short in reality, and dead batteries will continue to be a plague. But which one will fail? There is no way to know.

From my perspective, an illusory belief that battery reporting works when it doesn't actually work reliably is even worse than it not working reliably.

Nothing will eliminate devices with dead batteries. However, providing actual battery voltage and some user control over the percentage calculations will help.

Will help what? Knowing when to replace batteries?

Yes, and it will never be perfect.

IMO not worth the effort. When something will always fail unpredictably, investing effort to make it better is not well conceived -- a bit of trying to turn a sow's ear into a silk purse. The better strategy, at least for these cheap Z-Wave and Zigbee devices, is the one described above by @JohnRob.

Actually someone had done that. I believe it was Cambridge Consultants. They also made a lead balloon. I haven't heard anything about the screen door in a submarine. Likely it was labeled as classified.


If you were an engineer working for me and presented this proposal, my first reaction would be "show me the data".
It would be good to prove or disprove your proposal by monitoring a few sensors from new battery to dead battery.

I'm not trying to be negative here, but I started down the same road and gave up when I had multiple sensors (same brand and model, purchased at the same time) show different battery voltages upon installation of new batteries (again, the same brand of batteries).

I disagree with your conclusion. I feel battery percentage is generally useless.

Knowing the battery voltage gives users another tool to monitor their device batteries. The basic effort is minimal: start by changing the battery event message and adding voltage as a "Current State". A more sophisticated approach adds user control and battery-type preference fields.

That's the reason for allowing optional user input on min and max values.

This device battery life issue is one of serious concern and worth the effort to folks in the Security Industry.

One way some of them have addressed it is to prompt a notification if the device has not been heard from, via polling rounds or triggered activity, within 3, 6, 12, or 24 hours in some cases, or within an interval fixed by UL installation standards in others. In fact, this rises to the level of not letting you ARM the system until the wayward device is visible again. Overt and deliberate exclusion of the device is the only way around it. In other words, you cannot avoid being advised when your configured system devices are not all at a ready state.

If we can't trust the voltages to relate to some % life, if we can't trust firmware and device reporting, if we can't trust the Zigbee standards body or the folks that keep the keys to Zwave to address this....then perhaps it is an opportunity to be "THE HUB" that outshines the rest by finding a way to improve installation reliability through surrogate solutions.

The "I haven't heard from this device in X hours" is a good start.

I'm also thinking that it wouldn't be impossible to collect relevant data that impacts the battery life of a specific device, e.g. temperature extremes, usage, age since last change, AND the voltage & % left (for whatever they are worth) and with that information suggest a battery swap schedule for the whole network of devices.

I could see a device table showing Green - Yellow - Red accordingly. OK, granted this is ambitious...but either we get to where we can be informed about this most critical point of vulnerability in an installation...or we accept it as a weakness.
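As a rough illustration of the Green - Yellow - Red idea, a hub could combine the (admittedly imperfect) percentage estimate with time since last report. The thresholds below are invented for the example, not proposed values.

```python
# Hypothetical traffic-light battery status combining estimated percentage
# with hours since the device last reported. Thresholds are placeholders.
def battery_status(pct, hours_since_report):
    if hours_since_report > 24 or pct < 20:
        return "Red"      # out of touch or nearly dead: act now
    if hours_since_report > 12 or pct < 50:
        return "Yellow"   # worth watching
    return "Green"        # reporting regularly with a healthy estimate
```

The point is that even an unreliable percentage becomes more useful once it is cross-checked against actual device activity.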

Quite frankly, I bet a lot of folks would be happy with the hub's notification: "I haven't heard from this device in X hours, and/or...when I heard from this device it was sporadic, wasn't normal/complete/translatable...or some other symptom often related to a failing battery."

I know there are Apps out there that work towards this end. But it shouldn't be an independently maintained Community App that has to be relied upon to address this critical system/device vulnerability.

Posting after being reminded of this when changing a battery on a SmartThings Multi-Purpose Sensor V5 that I just today realized was out of touch after a couple days. Yeah, last entry showed 100% on the battery.


My first thought is that the Hubitat Hub is not a security device. In my home I have a traditional security device with hardwired sensors. And I have a Hubitat hub to control my lights etc.
For my battery devices I use the "Device Activity Check" app, which notifies you if a subscribed device doesn't respond within a definable hours:minutes interval.

As for predicting battery life status by monitoring voltage, that may be a valid approach; however, one cannot expect much voltage measurement accuracy from a $9 device.
Add to that variations in battery brands, how fresh the batteries are when installed, and variations in the current draw of different devices of the same model number.

If battery status is so important to a system the batteries should be replaced when the estimated battery life is at 50%.

I take your point...

But, think about all the applications you have read about in the Community and I bet you can come up with plenty that have progressively become just as important to people, small companies, and farms as their security systems. Maybe more so because they manage/oversee things wherein the implementer wants to trust the oversight 24/7/365.

I have a separate security system as well...but I also have HE-paired devices that aid it...especially when it comes to mitigating risk and/or allowing preemptive protective action (think heat, cold, water, gas, electrical, access, etc.). I would argue that the desired merging of security system capability (including its standardized level of reliability) with home/facility automation and oversight is already upon us.

As far as replacing batteries based on some estimation of half-life...all I can suggest is that this is 2021, not 1991. We should be better at automating all aspects of automation, including device battery oversight.


Things like this make me wish someone plugged in to Silicon Labs(Z-wave) or the Zigbee Alliance would pop up and say....

"yeah, this is a recognized weakness and improving on it with hardware & software standards is in the mill".

I don't believe this has anything to do with the "alliances"; this is hardware/software specific only, nothing to do with the transmit protocol. The biggest problem with battery reporting is, and always will be, the way a lithium battery discharges compared to non-lithium batteries. If you look at a lithium discharge curve, you will notice that it does not vary much over time but then drops off very fast at the end of its capacity. To measure this accurately, the hardware and software that reads the voltage needs to be very accurate, which is not viable for most consumer-level, non-critical electronics. Having different discharge curves in the firmware of the device, and letting the user select the profile based on the chemistry used, would give a much better indication IF it is correctly implemented.
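That flat-then-cliff shape is why a single linear voltage-to-percentage formula fits lithium cells so poorly; the per-chemistry profile idea would instead interpolate along the actual curve. A minimal sketch of that lookup, with illustrative (not measured) curve points:

```python
# Illustrative lithium discharge curve: (voltage, percent remaining) pairs,
# sorted from full to empty. The points are made up to show the flat plateau
# and steep tail; a real profile would come from the cell's datasheet.
LITHIUM_CURVE = [(3.0, 100), (2.9, 60), (2.8, 30), (2.6, 10), (2.0, 0)]

def lithium_percent(volts):
    # Clamp readings outside the table, interpolate linearly inside it.
    if volts >= LITHIUM_CURVE[0][0]:
        return 100
    if volts <= LITHIUM_CURVE[-1][0]:
        return 0
    for (v_hi, p_hi), (v_lo, p_lo) in zip(LITHIUM_CURVE, LITHIUM_CURVE[1:]):
        if v_lo <= volts <= v_hi:
            frac = (volts - v_lo) / (v_hi - v_lo)
            return round(p_lo + frac * (p_hi - p_lo))
```

Even a crude table like this reflects the cliff better than a straight line between min and max voltage, which is the crux of the argument above.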

But this does not guarantee that buying higher-priced sensors will give you better battery reporting; it also depends on how the manufacturer implements the ADC, how and when they read the voltage value, etc.


You're right.

I guess it was a hopeful thought that these groups could evolve and reach beyond their protocol purpose saying, "if you report any of these xyz standard parameters, in this example say battery or voltage, then this is the acceptable way to do that and this is the acceptable accuracy (i.e. assuming your discharge curves in the battery case)".

Consistency and reliability are often why folks used to cherish proprietary systems and hardware... yet we have turned the corner to expecting things to be just as reliable and consistent (if not more) through openness and common standards.


Things like this make me wish someone plugged in to Silicon Labs(Z-wave) or the Zigbee Alliance would pop up and say...

You're not hearing me.

  1. I am in the process of purchasing a few more Zigbee contact sensors. They will cost me $11.80 including shipping. These are not "security grade" sensors. What premium performance would you expect?

"yeah, this is a recognized weakness and improving on it with hardware & software standards is in the mill" .

  1. Not going to happen. The alliances control the protocol not the hardware (nor the software). I certainly don't want them forcing premium performance on my $11.80 sensor.

  2. Not all battery brands last the same amount of time. In general, the "big" names are better than the unknown brands, but that is only "in general". As above, what do you expect from a $0.50 battery?

I bet you can come up with plenty that have progressively become just as important to people, small companies, and farms as their security systems.

They should not use home automation systems for security. Neither Z-Wave nor Zigbee has the security "hardness" of an alarm-grade sensor. I may be wrong here, but I don't believe security systems use a mesh.