Absolute error in machine-precision terms.


I am trying to wrap my head around errors in floating point calculations. Let me denote absolute error as follows: $e = |x - \hat{x}|$, where $x$ is the exact number and $\hat{x}$ is its floating point representation. Assume round-to-nearest.

Now, the first thing I would like to understand is this inequality: the absolute error does not exceed half the machine precision times the absolute value of the number. Is it actually correct?

$$|x - \hat{x}| \leq \frac{\epsilon_1}{2}|\hat{x}|$$

Here $\epsilon_1$ denotes machine precision; for an IEEE 754 single-precision float it equals $2^{-23} \approx 1.19 \times 10^{-7}$ (not the smallest representable number, which is a different quantity). I understand the inequality when it looks like $|x - \hat{x}| \leq \frac{\epsilon_1}{2}$, but where does the absolute value on the right come from?


There are 2 answers below.

BEST ANSWER

Machine $\epsilon$ represents the relative, not the absolute, error. You have a certain number $n$ of bits in the mantissa, so the relative error is about $2^{-n}$. If your mantissa is multiplied by $2^{100}$, the relative error stays the same, but the absolute error is multiplied by the same $2^{100}$. That is why you have the $\hat x$ on the right. The modulus is just in case $\hat x$ is negative.

SECOND ANSWER

Machine Precision is defined as $\displaystyle{\color{#c00000}{\mbox{the smallest}}}$ $\displaystyle{x}$ $\displaystyle{\color{#c00000}{\mbox{such that}}}$ $\displaystyle{1 + x > 1}$ in floating-point arithmetic.

The following $\verb*C++*$ function returns the Machine Precision:

template<class T>
T machinePrecision()
{
    const T one = static_cast<T>(1);
    const T two = static_cast<T>(2);
    T mp = one;

    // Halve x until 1 + x rounds back to 1; mp keeps the last x
    // that still made a difference.
    for (T x = one; one + x > one; x /= two)
        mp = x;

    return mp;
}

Use as:

machinePrecision<float>();
machinePrecision<double>();
machinePrecision<long double>();

However, $\verb*C++*$ already defines, in the header $\verb*<cfloat>*$, the constants $\verb*FLT_EPSILON*$, $\verb*DBL_EPSILON*$ and $\verb*LDBL_EPSILON*$ as

"$\tt\mbox{Difference between 1 and the least value greater than 1 that is}$ $\tt representable$".