How to calculate relative error when values are close to zero or negative?


I'm trying to calculate the relative error of a machine learning model's predictions. Normally I'd calculate the relative error this way:

$RelError=\left | \frac{y-\hat{y}}{y} \right |$

However, since $y$ can be zero or close to zero, this formula causes problems. I changed it to the following:

$RelError=2\frac{\left |y-\hat{y} \right |}{\left | y \right |+ \left | \hat{y} \right |}$

This has the advantage that it isn't undefined at zero, nor does it spike around it. But I still have issues with the error for negative values.

Consider the following scenario:

$y = -1 \\ \hat{y}=5 \\ RelError=2\frac{\left |-1-5 \right |}{\left | 1 \right |+ \left |5 \right |}= 2\frac{\left |-6 \right |}{\left | 6 \right |}=2$
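To make the behavior concrete, here is a minimal sketch of the symmetric formula above (the guard for the case where both values are zero is my own assumption; the original formula leaves that case undefined):

```python
def rel_error_symmetric(y, y_hat):
    """Symmetric relative error: 2*|y - y_hat| / (|y| + |y_hat|)."""
    denom = abs(y) + abs(y_hat)
    if denom == 0:
        # Both values are zero: prediction is exact, so report zero error.
        return 0.0
    return 2 * abs(y - y_hat) / denom

print(rel_error_symmetric(-1, 5))  # 2.0, reproducing the scenario above
```

Note that for opposite-sign values, $|y-\hat{y}| = |y| + |\hat{y}|$, so the formula always returns exactly $2$, its maximum, regardless of the magnitudes involved.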

This doesn't seem right to me: the error isn't "twice as much" as the original value of $y$, and I'm not sure that's the impression I want to give.

Do you have any suggestions?

Thank you


I have a similar function for relative error. My definition is $$f(x,y) := z = \,|x-y|/(|x|+|y|). $$ In the case that $x$ and $y$ have opposite signs, this returns $1$, which is the maximum relative error. The downside is that if $x$ and $y$ have the same sign and are relatively close, then the function returns a value approximately one half of the usual relative error. The fix for that is to use $\,z(2-z)\,$ instead, which is close to $\,2z\,$ when $\,z\,$ is close to zero, so the modified result is now close to the usual relative error.
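The definition and the $z(2-z)$ correction can be sketched as follows (the zero-denominator guard is my own assumption, not part of the definition above):

```python
def rel_error(x, y):
    """|x - y| / (|x| + |y|), in [0, 1]; returns 1 for opposite signs."""
    denom = abs(x) + abs(y)
    if denom == 0:
        return 0.0
    return abs(x - y) / denom

def rel_error_adjusted(x, y):
    """z*(2 - z) correction: ~2z for small z, close to the usual relative error."""
    z = rel_error(x, y)
    return z * (2 - z)
```

For example, with $x = 100$ and $y = 101$ the usual relative error is $1/100 = 0.01$; `rel_error` gives $1/201 \approx 0.005$, while `rel_error_adjusted` gives roughly $0.0099$, much closer to the usual value. Note that the correction maps the maximum $z = 1$ to $1 \cdot (2 - 1) = 1$, so opposite-sign pairs still yield the maximum.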

A perhaps better function definition is $$ f(x,y) := |x-y|/\max(|x|,|y|). $$ This is much closer to the usual relative error, but the denominator is modified so that the function is symmetric in $\,x\,$ and $\,y.\,$ It is equal to the usual relative error whenever the reference value is the one with larger magnitude, i.e. in half of the cases.
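This variant can be sketched as follows (again, the guard for two zero arguments is my own assumption):

```python
def rel_error_max(x, y):
    """|x - y| / max(|x|, |y|); symmetric in x and y, values in [0, 2]."""
    m = max(abs(x), abs(y))
    if m == 0:
        return 0.0
    return abs(x - y) / m
```

In the question's scenario this gives $|-1-5|/\max(1,5) = 6/5 = 1.2$, instead of the fixed maximum $2$, so opposite-sign errors are no longer all collapsed to a single value.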