Almost every sensor I have has a way to enter an offset to correct the reported temperature to the actual temperature. With everything being adjustable, how do you get an actual temperature to start with?
Probably a question for the developer of the driver you are using. What driver are you using for your temperature devices?
Without calibrated scientific instrumentation, there is no way to get an "actual" temperature. Even then, that temperature will vary depending on the location in the house. When placing sensors, I try to place them on internal walls, away from direct sunlight and the flow from an AC/heat vent. It does make a difference.
One method would be to select a portable sensor as your "truth" and walk it to the location of each other temperature sensor in your house (or outside). Wait 5 minutes, then calibrate the local sensor to the measurement from the truth device. That method will provide you with a good measurement throughout the house showing the variations from room to room.
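The walk-around method above boils down to simple arithmetic per room. A minimal sketch, with made-up readings (room names and values are purely illustrative, not from the thread):

```python
# Sketch of the "truth sensor" walk-around calibration described above.
# For each room, record the local sensor's reading and the portable
# "truth" sensor's reading after letting both settle for ~5 minutes.
readings = {
    # room: (local_sensor_reading_F, truth_sensor_reading_F)
    "kitchen": (70.8, 69.5),
    "bedroom": (68.2, 69.1),
    "office": (71.4, 70.0),
}

# The offset you enter in each driver is truth minus local, so that
# corrected = local + offset matches the truth device.
offsets = {room: round(truth - local, 1) for room, (local, truth) in readings.items()}
print(offsets)  # {'kitchen': -1.3, 'bedroom': 0.9, 'office': -1.4}
```

Note that an offset computed this way is only valid near the temperature you measured at; as later posts point out, these cheap sensors can drift differently at higher or lower temperatures.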
Detailed as always @djgutheinz
As someone who does calibration for a living, this hurts so much.
It will give you somewhat consistent measurements throughout the house. The only way to get accurate measurements is to use equipment designed to calibrate these types of devices (such as an air chamber with circulated air and a device that has at most 25% of the uncertainty of the device you are testing). Even sitting within an inch of each other, you can have drastic temperature differences unless you are circulating air in an enclosed chamber.
"Good" measurements are relative. It really depends on the specs for the device. Any one of these where the MFR is claiming better than +/-2 degrees F is lying (and even that is a stretch).
And even when you calibrate in your method, you would have to re-calibrate periodically. Just like you have to calibrate the calibration tools periodically.
Yep,
This is me checking my grill thermometer (I had to show the MFR that their probe was bad so they would send me a replacement: note the ~75 degree probe on the display). Ice bath using DI water and crushed ice made with DI water per ASTM E563 (tap water can cause up to 0.5 degrees of error).
Wow!...I really like the official NASA insulated calibration cup and the reference to ASTM E563.... I did have to look up the word "Metrology"
I improved my home temperature sensors a while ago. I now use a triple back box with a DHT22 (housed alone in the single) and a pair of 5 V fans/grilles that periodically move air over the sensor.
I calibrated this (successfully) using some cheap battery-powered portable sensors from Amazon to get a somewhat accurate way of measuring the temperature within a room.
The initial issue was that temperature readings were poor, as there was little air movement overall. The location was also rubbish, as the room had hot/cold spots. This meant the room read cold and the heating kicked in, but it took some time to cut back out again even when, say, the couch was a bit too warm, because the sensor was on the other side of the room.
Basically, air movement over the sensor was the key, along with a change in location.
Not to be confused with meteorology. We check their instrumentation. Their forecasts are on them. (one local military base used to call us to give them the local barometric pressure)
As to the cup, any insulated cup or thermos will work as long as it is clean. For holding 0.01 kelvin, you want a dewar flask per the ASTM though. Also of note, despite the seeming stability of an icebath, it isn't a reference in and of itself. It actually requires a calibrated thermometer, usually a high end PRT, in order to be official. Even with the best of DI water, the temperature will almost always be slightly off. (Most labs use triple point of water cells when it matters)
In any case, the original point was that the OP did not state how accurate he wanted his baseline. You would need to know that to know the best way to proceed. And, depending on the want, it might not even be possible given these are cheap devices. (In the measurement world, accuracy costs $$$..... way more than people would pay for these types of devices). You can convince yourself that they are accurate at a given temperature. But, once you go higher or lower than the temp you checked at, results can and will vary.... sometimes drastically.
I found all (well, most) of my temperature sensors were surprisingly similar in reading. And while my logical self knows a 0.3 °F difference makes no difference, the OCD in me tries to find the discrepancy and compensate.
Assuming no typical smart temperature sensor can be tested in boiling water or an ice bath, I put all my sensors in a small cardboard box (a small insulated cooler would be better). Put the box in an area where the temperature is relatively stable, with no direct sun or A/C air blowing on it. Wait about 2 hours and take a reading from each device. Looking at the results, decide what makes the most sense to "standardize" on. I don't recommend a simple average, because you may have 4 sensors grouped together and one flyer.
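One way to pick the "standardize" value without letting a flyer skew it is to use the median instead of the mean. A quick sketch with hypothetical readings (sensor names and values are invented for illustration):

```python
import statistics

# Hypothetical readings (in deg F) after the sensors sat together in the
# box for ~2 hours; sensor_e is the "flyer".
readings = {
    "sensor_a": 70.1,
    "sensor_b": 70.3,
    "sensor_c": 70.2,
    "sensor_d": 70.2,
    "sensor_e": 72.8,
}

# A simple average is pulled up by the flyer; the median ignores it.
mean = statistics.fmean(readings.values())
baseline = statistics.median(readings.values())

# Offset to enter for each sensor so they all agree with the baseline.
offsets = {name: round(baseline - value, 1) for name, value in readings.items()}
print(f"mean={mean:.2f}  median baseline={baseline}")
print(offsets)
```

Here the mean (70.72) is dragged toward the flyer, while the median (70.2) sits with the tightly grouped four, which matches the advice above about not using a simple average.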
This topic reminded me of this:
We had this problem with "what's the temperature outside?" AccuWeather says one thing, Apple Weather another, WeatherBug another, etc. So I bought an Ecowitt weather station, put it in our backyard, and we just go with that as the "true outdoor temperature" at our house. Inside, I have so many devices reporting the "inside temperature" that I have no idea what the temperature is.
Of course the issue is, the temperature anywhere (inside, outside, down the road) is never the same. It varies inside with heating or A/C not being in perfect balance, or the sun hitting one room and not another. In my case the upper floors are usually warmer than the lower floors until the sun goes down; then it is the opposite.
This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.