I've got several devices that are now reporting temperature, and I think it'd be a good idea to get them all calibrated so the values are closer to accurate. I really don't want to drop several hundred bucks on an accurate meter like the Fluke (https://smile.amazon.com/dp/B005T5JW2S/). It would be nice, however, to get them within a few degrees of each other. I currently have one of these cheap infrared temperature guns:
That’s pretty much what I did - used a cheap laser thermometer.
I also noticed that my cheap thermometer would read differently depending on how close or far away I was, so I tried to maintain a common distance during my calibration.
I did three passes just to average out the data.
I remember reading that the infrared gun I have is most accurate at around 14 inches, so I'll try to keep it at about that distance. Making multiple passes is a good idea.
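If it helps, the multi-pass approach is just a simple mean. A tiny sketch in Python (the readings below are made up for illustration):

```python
# Three passes with the IR gun at the same ~14-inch distance.
# Values are hypothetical example readings in °F.
passes = [68.9, 69.4, 69.1]

# Average the passes to smooth out gun jitter and aim variation.
average = sum(passes) / len(passes)
print(round(average, 1))  # 69.1
```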
I used a spare sensor and put it on top of my thermostat. Calibrated the spare to the thermostat, then moved it to where every other sensor is and calibrated each of those sensors. They might all be off, but at least they are all off by the same amount.
Every month or two I will relocate each sensor (5 total) to a spot near my wall-mount thermostat, let them sit for about half an hour, and use that as my reference set point.
But I'm definitely going to try the infrared gun on the next go-round to see if there is a temperature differential, and if it's acceptable, use that from now on.
but you can use non-professional ones too. You always need 2, because the calibration unit itself could be slightly out of range. If you measure 3 temperatures (1 sensor, 2 references), take the two closest values (and not the average) of the 3 measurements.
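One way to read "take the closest values" in code: find the pair of readings that agree most closely, discard the outlier, and use that pair. A sketch (my interpretation; the example readings are invented):

```python
from itertools import combinations

def closest_pair_estimate(readings):
    """Given three readings (1 sensor, 2 references), trust the two
    values that agree most closely and average just those two,
    discarding the outlier."""
    pair = min(combinations(readings, 2), key=lambda p: abs(p[0] - p[1]))
    return sum(pair) / 2

# 23.0 disagrees with the other two, so it gets dropped.
estimate = closest_pair_estimate([21.4, 21.6, 23.0])
```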
But don't be paranoid. The temperature in a room is far from equilibrium. My motion/temperature sensor (high on an external wall of my living room; old house, poor insulation) reads about 3 degrees different from the one located in the middle of the room... But we live in the living room, not against the living room wall.
I agree with the use of a regular temperature sensor like the probe above or even a bulb type.
IMHO the IR thermometers are not a good choice, as their readings are affected by a variety of factors, including the color of the item being measured (silver, black, white, green, etc.).
My method is:
With the probe sensor:
Mix crushed ice and water to make an "ice bath" (mostly ice, with just enough water to fill the gaps). These were used for thermocouple reference junctions in the old days. It will be 0 °C, plus or minus a few hundredths of a degree.
Measure the ice bath with the probe, stirring slowly as you take this measurement. Note the reading.
Do a similar measurement with boiling water. This is not as accurate as the ice bath (the boiling point shifts with altitude and barometric pressure), so here you just want to be "close"; a few degrees one way or the other is enough to determine whether the thermometer is way off.
Now for the comparison:
Take a small cardboard box (6" cube to 10" cube or similar) and put it on an insulated surface (i.e. putting the box on your granite counter is likely not the best place) where there are no air currents (from A/C vents, heating vents, etc.). Allow both thermometers to stop changing, probably 10 to 15 minutes, then take readings of both and you have your offset.
This is the best you can do without very expensive ($$$$$) equipment. And it is pretty good: the ice bath is very accurate, and in most digital thermometers the error is an offset at all temperatures as opposed to a slope change.
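That constant-offset model is easy to put in code. A minimal sketch (the 0.8 °C ice-bath reading below is just an example, not a real measurement):

```python
def compute_offset(reading_at_ice_bath, reference=0.0):
    """Offset = true reference (0 °C) minus what the probe actually read."""
    return reference - reading_at_ice_bath

def correct(reading, offset):
    """Apply the same constant offset to any later reading."""
    return reading + offset

# Hypothetical: probe read 0.8 °C in the ice bath, so offset is -0.8.
offset = compute_offset(0.8)
corrected = correct(21.5, offset)  # 20.7
```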
Thanks for the suggestions. It reminded me that I have a Thermopop too that could be used. It's been tested by third parties to be within ±2 °F. That's good enough for my purposes.
Another way to get the temperature with the IR gun, while getting around the material issue, is to read the same material every time. For example, I have a [ahem] "few" spare Iris sensors. I could put them beside the live sensors (without powering them) and use the backs of the casings as the measuring surface. That should be fairly consistent.
I can't put the temperature sensors on the home thermostat as there's not one in this old house. Cooling is by window AC units and heat is all standalone electrical units. One of the things I'm doing is adding a Hampton Bay ceiling fan in the living room (my office) to test integrating with Hubitat to see if I can circulate the heat downward and cut the amount of time I need to run the heaters. The plan is to use multiple sensors at different heights (floor to nearly the ceiling) to judge when to use the fan. If this works well, I'll add another fan to the bedroom which is the only other room where I spend a considerable amount of time and need to be concerned about maintaining a comfortable temperature.
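The floor-to-ceiling sensor idea boils down to a stratification rule: run the fan when the ceiling is noticeably warmer than the floor. A hypothetical sketch (sensor values and the 3-degree threshold are assumptions, not anything Hubitat-specific):

```python
def fan_should_run(floor_temp, ceiling_temp, threshold=3.0):
    """Run the ceiling fan (winter/reverse mode) when warm air has
    stratified near the ceiling by more than the threshold."""
    return (ceiling_temp - floor_temp) >= threshold

# Hypothetical readings in °F from the low and high sensors.
print(fan_should_run(66.0, 71.5))  # True  - push the warm air down
print(fan_should_run(68.0, 69.0))  # False - room is already mixed
```

In a real rule engine you'd also want some hysteresis (e.g. turn off only once the difference drops below 1 degree) so the fan doesn't cycle rapidly around the threshold.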
People with data overload either have too slow a computer or not enough drive space. Admittedly I am a data hog. I'm also data paranoid, having backups of my backups of my backups. And then a copy of those backups is synced offsite.
I'm in the same boat, but I'm more interested in figuring out the difference in temps between rooms than I am the actual temp.
When my thermostat is set to 70, two of the bedrooms are at wildly different temps, maybe 5 degrees apart, and our finished basement is at least 10 degrees colder.
None of my Iris temp sensors are connected to the thermostat, so that's one less thing to worry about.
Would I be OK just putting my 9 temp sensors on a table in the same room and adjusting their offsets until they all read the same?
What I did was take 1 temp sensor and put it right next to my thermostat to create a "standard". Then I moved that standard to where each of my other temp sensors is, let them sit for 3 hours, and calibrated each sensor to the standard.
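The arithmetic for that is just "standard reading minus sensor reading" at each location. A sketch with invented numbers (sensor names and readings are hypothetical):

```python
# Reading of the "standard" sensor, placed at each location in turn (°F).
standard = 70.4

# What each sensor reported at the same time, same spot (hypothetical).
readings = {"bedroom": 72.1, "office": 69.3, "basement": 68.8}

# Offset to enter into each sensor so it matches the standard.
offsets = {name: round(standard - r, 1) for name, r in readings.items()}
print(offsets)  # {'bedroom': -1.7, 'office': 1.1, 'basement': 1.6}
```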
I don't know where you are testing, but I would recommend putting them all in a cardboard box to eliminate air currents and to reduce the effect of one thermometer responding faster than another.