Let me explain my scenario in which I need to calculate absolute error.
Let's say X is the actual value, and X' is the value of X with some error e, so X' = X + e.
Let's say i = 1 to 10000. I have X(i) as well as X'(i). Now I want to know (and also plot on a graph) how much X' differs from X. One way to do this is to calculate the absolute error |X'(i) - X(i)| and plot it.
I want to know: are there other ways to see how much X' differs from X besides absolute error? Which one would be more appropriate in my case, given that I have to plot it as a graph? And which would be easier to follow for those who will see the plotted graph?
Thanks! :)
Why don't you just calculate the signed differences $X'_i - X_i$ and then create a histogram of them? Also, overlay zero error as a bright vertical line on the histogram. That should give people a good sense of the accuracy. If you feel it won't be too busy, you may also want to display (or maybe just calculate) the 2.5th and 97.5th percentiles, which give the interval that captures 95% of the observed error.
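A minimal sketch of this in Python with NumPy and Matplotlib. The data here are simulated (the question doesn't give actual values), so `x`, `x_err`, and the noise level are assumptions for illustration:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, for scripted use
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n = 10000
x = rng.uniform(0, 100, n)           # actual values X(i)  (simulated)
x_err = x + rng.normal(0, 2.0, n)    # X'(i) = X(i) + e    (simulated error)

diffs = x_err - x                    # signed differences X'(i) - X(i)

# Interval capturing 95% of the observed error
lo, hi = np.percentile(diffs, [2.5, 97.5])

fig, ax = plt.subplots()
ax.hist(diffs, bins=50)
ax.axvline(0, color="red", linewidth=2, label="zero error")
ax.axvline(lo, color="gray", linestyle="--", label="2.5 / 97.5 percentiles")
ax.axvline(hi, color="gray", linestyle="--")
ax.set_xlabel("X' - X (signed error)")
ax.set_ylabel("count")
ax.legend()
fig.savefig("signed_error_hist.png")
```

A histogram of signed differences keeps the sign information that absolute error throws away, so a viewer can see at a glance whether X' is biased high or low, not just how far off it is.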