Estimating temperature most accurately with different thermometers


I recently came upon this semi-open-ended question and wanted to think through it with you all.

You have 5 measurements from 5 different thermometers, which are unbiased, but each with a different variance. Given those measurements, how would you estimate the oven temperature most accurately?

I personally would take $X$ measurements with each thermometer, let's say 40 (a total of $40 \times 5$ measurements). Then I would calculate the sample mean $\mu$ and sample standard deviation $\sigma$ for each thermometer and fit a normal distribution to its readings.

If I see that the measurements are normally distributed, I would use the thermometer with the lowest variance.
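As a minimal sketch of this procedure, here is a simulation with hypothetical values (a true temperature of 180 and five assumed standard deviations, none of which come from the original question):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: the true oven temperature and each thermometer's
# true standard deviation (unknown to the experimenter in practice).
true_temp = 180.0
true_sigmas = [0.5, 1.0, 2.0, 3.0, 5.0]

# 40 unbiased readings per thermometer, as described above.
readings = [true_temp + s * rng.standard_normal(40) for s in true_sigmas]

# Sample mean and sample standard deviation for each thermometer;
# the "lowest variance" strategy would pick the thermometer with
# the smallest sample standard deviation.
for i, x in enumerate(readings):
    print(f"thermometer {i}: mean = {x.mean():.2f}, sd = {x.std(ddof=1):.2f}")
```

With 40 readings the sample standard deviations are still noisy, so the ranking of thermometers can occasionally be wrong for thermometers with similar variances.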

I feel like my answer might be too simple and I'm wondering if I should go about it a different way.

You have $n$ measurements from $n$ different thermometers, which are unbiased, but each with a different variance. Out of those measurements, how would you ensure that you measure the oven temperatures most accurately?

This is related to what is usually called data validation or data reconciliation (have a look here).

The most probable value is given by the inverse-variance weighted mean $$\widehat{T}=\left(\sum_{i=1}^n \frac {T_i}{\sigma_i^2}\right)\left(\sum_{i=1}^n \frac {1}{\sigma_i^2}\right)^{-1},$$ which has the smallest variance, namely $\left(\sum_{i=1}^n 1/\sigma_i^2\right)^{-1}$, among all unbiased linear combinations of the $T_i$: readings from precise thermometers are weighted up, and noisy ones weighted down, rather than discarded.
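A short sketch of this estimate, using hypothetical readings $T_i$ and standard deviations $\sigma_i$ (the numbers below are illustrative, not from the question):

```python
import numpy as np

# Hypothetical data: n = 5 readings T_i with known standard deviations sigma_i.
T = np.array([178.0, 181.5, 179.2, 183.0, 176.5])
sigma = np.array([0.5, 1.0, 2.0, 3.0, 5.0])

w = 1.0 / sigma**2                     # inverse-variance weights
T_hat = np.sum(w * T) / np.sum(w)      # weighted estimate of the temperature
sigma_hat = np.sqrt(1.0 / np.sum(w))   # standard deviation of the estimate

print(f"T_hat = {T_hat:.3f} +/- {sigma_hat:.3f}")
```

Note that `sigma_hat` (about 0.43 here) is smaller than the smallest individual $\sigma_i$ of 0.5, which is exactly the advantage of combining all the readings over using only the best thermometer.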