Why does the formula for the uncertainty due to bias contain the square-root of 3?


I have been tasked with using standard gage blocks to estimate the uncertainty of a super micrometer, and I must include the uncertainty due to bias in my calculation. The formula I am using for that is:

$u_{bias}=\frac{|x_{g}-x_{m}|}{\sqrt{3}}$

Where $x_g$ is the value of the gage block and $x_m$ is the average of the measurements I have taken.
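To make the formula concrete, here is a short sketch of the calculation with made-up numbers (the gage block value and the five readings below are purely illustrative, not from my actual data):

```python
import math

# Hypothetical example: a 0.500 in gage block and five
# micrometer readings (all values invented for illustration).
x_g = 0.500                                   # certified gage block value
readings = [0.5002, 0.5001, 0.5003, 0.5002, 0.5002]

x_m = sum(readings) / len(readings)           # mean of the measurements
u_bias = abs(x_g - x_m) / math.sqrt(3)        # standard uncertainty due to bias

print(f"x_m = {x_m:.5f}, u_bias = {u_bias:.6f}")
```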

What I gather from researching other sites is that bias is a Type B uncertainty, to which one conventionally assigns a rectangular (uniform) distribution. The formula comes from the square root of the variance of this distribution:

$u=\frac{a_+-a_-}{\sqrt{12}}$

where $a_+$ and $a_-$ are the upper and lower bounds of the distribution. If the expected value is assumed to lie at the midpoint of the interval, and the full width $a_+-a_-$ is written as $2a$ (so $a$ is the half-width), this reduces to

$u=\frac{a}{\sqrt{3}}$
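For reference, the $\sqrt{12}$ itself is just the variance of a uniform distribution on $[a_-, a_+]$, which follows from standard probability (this derivation is my addition, not from the sites I was reading):

$\operatorname{Var}(X) = E[X^2] - E[X]^2 = \frac{a_+^3 - a_-^3}{3(a_+ - a_-)} - \left(\frac{a_- + a_+}{2}\right)^2 = \frac{(a_+ - a_-)^2}{12}$

so $u = \sqrt{\operatorname{Var}(X)} = \frac{a_+ - a_-}{\sqrt{12}}$, and substituting the width $a_+ - a_- = 2a$ gives $u = \frac{2a}{\sqrt{12}} = \frac{a}{\sqrt{3}}$.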

THAT is the part I do not understand. The interval $[a_-, a_+]$ and its half-width $a$ do not obviously appear in the formula I must use. For some reason I can't connect the dots between $|x_g - x_m|$ and that $2a$ business.