Let's say I take a ton of measurements of the diameter of a marble with a 0.001" resolution micrometer, and I calculate the standard deviation of the sample set. Would the standard deviation be a better measure of uncertainty than the resolution of the instrument?
Which is a better measure of uncertainty: standard deviation or resolution uncertainty?
16.8k Views Asked by Bumbble Comm https://math.techqa.club/user/bumbble-comm/detail
There are 2 best solutions below
The standard deviation tells you how precise the measurement is. It quantifies the typical magnitude (the root-mean-square size) of the random measurement error.

If you have actual bounds on the magnitude of the measurement error, you can use interval arithmetic. I love interval arithmetic, but it can produce overly pessimistic bounds.
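As a minimal sketch of the interval-arithmetic idea (the marble numbers here are made up for illustration): represent each measurement as a `(lo, hi)` pair and propagate the bounds through arithmetic. The pessimism shows up because interval subtraction cannot know that two operands are the same quantity.

```python
# Toy interval arithmetic on (lo, hi) pairs -- a sketch, not a library.
def iadd(x, y):
    # Sum of two intervals: lows add, highs add.
    return (x[0] + y[0], x[1] + y[1])

def isub(x, y):
    # Difference of two intervals: worst case pairs x's low with y's high.
    return (x[0] - y[1], x[1] - y[0])

# A hypothetical diameter reading known only to +/- 0.0005":
d = (0.4995, 0.5005)

# Stacking two such marbles: the widths add, and so do the bounds.
print(iadd(d, d))   # about (0.9990, 1.0010)

# But d - d should be exactly 0; interval arithmetic forgets the two
# operands are the same quantity, so the bound is pessimistic:
print(isub(d, d))   # about (-0.0010, 0.0010), not (0, 0)
```

This "dependency problem" is exactly why interval bounds tend to grow faster than the true uncertainty when the same measured quantity appears more than once in a formula.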
The resolution of the instrument only tells you that you cannot resolve values that fall between tick marks. It does not tell you how consistently the results are recorded. Repeatedly measuring an object of known length directly measures that recording variability through the estimated standard deviation. This estimate will not be exact: it is subject to sampling variability (it is a sample estimate of a population parameter).
The standard deviation is a measure of the overall spread, which includes both the instrument's quantization and the variability of the recording process, whereas the instrument resolution captures only the quantization. Relying on resolution alone can therefore understate the uncertainty. The sample variance is a statistically unbiased estimate of the population variance, which does capture that combined spread.
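The comparison above can be sketched with a small simulation. Assumptions: a hypothetical true diameter and repeatability noise (the numbers are invented for illustration), readings quantized to the 0.001" resolution, and the common convention that a reading uniformly distributed within half a tick contributes a standard uncertainty of resolution/√12.

```python
import random
import statistics

random.seed(0)

RESOLUTION = 0.001        # micrometer resolution, inches
TRUE_DIAMETER = 0.50023   # hypothetical true diameter, inches (assumption)
NOISE_SD = 0.0004         # hypothetical repeatability noise, inches (assumption)

def measure():
    # True value plus random measurement noise, then quantized to the
    # nearest tick of the micrometer.
    raw = random.gauss(TRUE_DIAMETER, NOISE_SD)
    return round(raw / RESOLUTION) * RESOLUTION

readings = [measure() for _ in range(1000)]

sample_sd = statistics.stdev(readings)
# Standard uncertainty from resolution alone, assuming a reading is
# uniformly distributed within +/- half a tick.
resolution_u = RESOLUTION / 12 ** 0.5

print(f"sample SD    = {sample_sd:.5f} in")
print(f"resolution u = {resolution_u:.5f} in")
```

With these assumed numbers the sample standard deviation comes out larger than the resolution-only figure, because it absorbs the repeatability noise as well as the quantization; if the process noise were much smaller than a tick, the two would be comparable and the readings would pile up on one or two values.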