I have roughly 39 data points, each a % error value. From my research it seems it is not correct to simply take the average of the % errors. Is this right? If so, what is the correct way to average % error values? All of the data points are weighted equally. Thanks!
How to take the average for percent error?
There are 2 best solutions below
I assume the errors are given with their sign (direction), not as absolute values.
Let $r_i$ be the relative deviation of data point $i$ from the true value, written in decimal notation.
Then the average deviation is
$$d=\sqrt[n]{(1+r_1)\cdot (1+r_2)\cdot (1+r_3)\cdot \ldots\cdot (1+r_n)}-1$$
Each $r_i$ can be negative or positive. For instance, if the true value is $120$ and the deviation is $-12$, then $r_1=\frac{-12}{120}=-0.1$, and consequently $1+r_1=0.9$.
To express $d$ as a percentage, multiply it by $100\%$.
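For concreteness, here is a minimal Python sketch of this geometric-mean approach (the function name `average_relative_error` and the sample deviations are my own, chosen only for illustration):

```python
import math

def average_relative_error(deviations):
    """Geometric-mean average of signed relative deviations.

    deviations: iterable of r_i values in decimal form (e.g. -0.1 for -10%).
    Returns d such that (1 + d)**n equals the product of all (1 + r_i).
    """
    n = 0
    log_sum = 0.0
    for r in deviations:
        log_sum += math.log1p(r)  # log(1 + r), numerically stable for small r
        n += 1
    return math.expm1(log_sum / n)  # exp(mean of logs) - 1

# Example: deviations of -10%, +5%, and +2%
r = [-0.10, 0.05, 0.02]
d = average_relative_error(r)
print(f"average deviation: {d:.4%}")  # multiply by 100% for a percentage
```

Working in logarithms and taking the mean of the logs is just a numerically stable way of computing the $n$-th root of the product in the formula above.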
You certainly can average the percent-error values; that is a well-defined operation. As Dilbert says, you can combine them multiplicatively, too. Whether either expresses what you want can be quite subtle. You are probably remembering that averaging the percentage errors does not give the same result as dividing the average error by the average true value. If you explain carefully what you want to express, that will lead to the correct way to compute it.
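To make that distinction concrete, here is a small Python sketch (the data are invented purely for illustration) contrasting the plain arithmetic mean of the percent errors with the total error divided by the total true value:

```python
true_values = [120.0, 80.0, 50.0]
measured    = [108.0, 84.0, 51.0]

errors = [m - t for m, t in zip(measured, true_values)]
pct_errors = [100.0 * e / t for e, t in zip(errors, true_values)]

# (1) arithmetic mean of the individual percent errors
mean_pct = sum(pct_errors) / len(pct_errors)

# (2) total error divided by total true value, as a percentage
overall_pct = 100.0 * sum(errors) / sum(true_values)

print(f"mean of percent errors : {mean_pct:+.2f}%")   # -1.00%
print(f"overall percent error  : {overall_pct:+.2f}%")  # -2.80%
```

The two numbers differ because the first treats every data point equally regardless of its size, while the second effectively weights each point by its true value; which one is "correct" depends on what you want the average to express.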