Calculating accuracy from relative error


Assume an estimate and a measured value $$ E(x) = 6, \qquad M(x) = 2. $$ The relative error is $$ E_r = \left|\frac{E(x) - M(x)}{M(x)}\right| = \frac{4}{2} = 2. $$ For the accuracy I have found the formula $$A(x) = |1 - E_r(x)|,$$ which in this case gives $$A(x) = |1 - 2| = 1 = 100\%,$$ which makes no sense at all. Where is the error in the accuracy calculation?
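A minimal sketch reproducing the question's arithmetic, using the values $E(x)=6$ and $M(x)=2$ from the question, makes the surprising result concrete:

```python
# Values from the question: estimate E(x) and measured (true) value M(x).
E, M = 6.0, 2.0

# Relative error: |(E - M) / M| = |(6 - 2) / 2| = 2.0
rel_error = abs((E - M) / M)

# Accuracy formula from the question: |1 - E_r| = |1 - 2| = 1.0, i.e. "100%"
accuracy = abs(1 - rel_error)

print(rel_error)  # 2.0
print(accuracy)   # 1.0
```

The absolute value bars turn a relative error of $2$ into an "accuracy" of $1$, which is exactly the paradox being asked about.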


1 Answer


The accuracy formula is intended for small relative errors. If the relative error is $0.1$, the formula says the accuracy is $90\%$. That is itself an approximation: it ignores the roughly $1\%$ difference between $1+x$ and $\frac{1}{1-x}$. When you say an estimate is $50\%$ accurate, do you mean it lies between $0.5$ and $1.5$ times the true value, or between $0.5$ and $2.0$ times it? The formula makes no sense once the relative error exceeds $1$. In that case I would drop the absolute value bars and accept a negative answer, which at least warns you what is going on.
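The suggestion above (drop the absolute value bars so a relative error above $1$ produces a negative, clearly out-of-range "accuracy") can be sketched as follows; the function name `signed_accuracy` is my own choice for illustration:

```python
def signed_accuracy(estimate, measured):
    """Accuracy as 1 - relative error, WITHOUT absolute value bars.

    For small relative errors this matches the usual formula
    (e.g. relative error 0.1 -> 90% accuracy).  Once the relative
    error exceeds 1, the result goes negative, which signals that
    the formula is outside its valid range instead of silently
    producing a misleading "100%".
    """
    rel_error = abs((estimate - measured) / measured)
    return 1 - rel_error

print(signed_accuracy(3.0, 2.0))  # relative error 0.5 -> 0.5 (50% accurate)
print(signed_accuracy(6.0, 2.0))  # relative error 2.0 -> -1.0 (out of range)
```

With the question's values ($E=6$, $M=2$) this returns $-1$ rather than $1$, making the breakdown of the formula visible.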