I have decided to run a little test of the accuracy of my temperature sensor/hygrometer. I currently have a WMR88 temperature sensor (Sensor 1) housed in a small wooden radiation screen, and I have now added to the same screen a small, cheap LCD temperature sensor/hygrometer from eBay, which also has a max/min function. I shall refer to this as Sensor 2.
Observations so far:
- Sensor 2 seems to respond more rapidly to temperature changes than Sensor 1 (e.g. when being handled)
- Sensor 2 consistently reads temperatures 0.6 °C to 1.1 °C lower than Sensor 1 and humidity the same or slightly higher.
- Sensor 2 temperatures are closer to readings from the local official Met Office site
09/04/14 1200L: Sensor 1 = 14.6 °C Sensor 2 = 13.8 °C
09/04/14 2000L: Sensor 1 = 13.7 °C Sensor 2 = 12.7 °C
10/04/14 0800L: Sensor 1 = 5.6 °C Sensor 2 = 4.7 °C
10/04/14 2100L: Sensor 1 = 11.9 °C Sensor 2 = 11.0 °C
11/04/14 0700L: Sensor 1 = 9.5 °C Sensor 2 = 8.4 °C
11/04/14 1700L: Sensor 1 = 14.9 °C Sensor 2 = 14.1 °C
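For what it's worth, the offset in the six readings above is fairly consistent. A quick Python sketch (values copied straight from the log) shows the average difference and how much it varies:

```python
# Paired readings from the log above: (Sensor 1, Sensor 2) in °C.
readings = [
    (14.6, 13.8),  # 09/04/14 1200L
    (13.7, 12.7),  # 09/04/14 2000L
    (5.6, 4.7),    # 10/04/14 0800L
    (11.9, 11.0),  # 10/04/14 2100L
    (9.5, 8.4),    # 11/04/14 0700L
    (14.9, 14.1),  # 11/04/14 1700L
]

# Sensor 1 minus Sensor 2 for each pair of readings.
diffs = [s1 - s2 for s1, s2 in readings]

mean_offset = sum(diffs) / len(diffs)
spread = max(diffs) - min(diffs)

print(f"mean offset: {mean_offset:.2f} °C, spread: {spread:.2f} °C")
# → mean offset: 0.92 °C, spread: 0.30 °C
```

So over these six readings Sensor 1 reads about 0.9 °C higher on average, and the offset only varies by about 0.3 °C, which looks more like a steady calibration difference than random noise.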
I know it's only a relatively small difference, but it seems strange considering both sensors are in the same screen and only about an inch apart. Could it be that the cheaper LCD sensor is actually the more accurate of the two?