Double precision: When is near-zero considered zero?


I came across an issue in code where I noticed that the standard deviation, $\sigma$, of a set of identical constant values, e.g. 0.33333333 ≈ 1/3, was not equal to zero. Instead, $\sigma$ came out as a small near-zero value.

Below is an example Excel spreadsheet that easily replicates the issue: convert 1/3 to decimal form, fill a column with that value, and then calculate sigma.

Is there a generic mathematical function that one can use perhaps involving modulo, floor, ceiling, etc., on this value of $\sigma$ that will identify it as effectively being zero? Does one need to estimate the machine precision, $\epsilon$, for the code one is working with?
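As a sketch of one common approach (my own illustration, with hypothetical helper names, not something from the question itself): compare $\sigma$ against a tolerance scaled by the machine epsilon and the magnitude of the data. Note that for a standard deviation computed via the one-pass "computational" formula, the rounding error in the *variance* is on the order of $\epsilon \cdot \text{scale}^2$, so after the square root the residual $\sigma$ can be as large as roughly $\sqrt{\epsilon} \cdot \text{scale}$; the tolerance below is scaled accordingly.

```python
import math
import sys


def naive_std(xs):
    # One-pass "computational" formula: sqrt(E[x^2] - E[x]^2).
    # This suffers catastrophic cancellation for identical values,
    # producing a tiny nonzero sigma, much like a spreadsheet can.
    n = len(xs)
    s = sum(xs)
    sq = sum(x * x for x in xs)
    var = sq / n - (s / n) ** 2
    return math.sqrt(max(var, 0.0))  # clamp tiny negative variances


def is_effectively_zero(sigma, xs, k=16.0):
    # Hypothetical helper: treat sigma as zero when it is below
    # k * sqrt(machine epsilon) * (data magnitude). The factor k
    # and the sqrt scaling are heuristic choices, not a standard.
    scale = max((abs(x) for x in xs), default=1.0)
    tol = k * math.sqrt(sys.float_info.epsilon) * scale
    return abs(sigma) <= tol


data = [1 / 3] * 10
sigma = naive_std(data)
print(sigma)  # zero or a tiny near-zero value, depending on rounding
print(is_effectively_zero(sigma, data))
```

The key design point is that the threshold is *relative* to the data's magnitude rather than an absolute constant like 1e-9, so the same test works whether the constant values are 1/3 or 3×10⁸.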

[Screenshot: Excel spreadsheet of repeated 0.33333333 values with a nonzero computed sigma]