I have two functions. One was derived in software, which we can call the exact function, $f_{exact}$. The other is a result I obtained from hardware, which we can call the approximation, $f_{approx}$.
Both functions are discrete, sampled at 128 points, so it wouldn't make sense to try to derive an explicit expression for either.
Now I want to quantify the "error" of the approximation, i.e., how different the approximate function is from the exact one. What would be the best choice to showcase the difference? I could take the maximum absolute difference between the functions:
$$m = \max_{i\in[0,127]} \left| f_{approx}(i) - f_{exact}(i) \right|$$
But I'm not sure if that's the best way to go. Any suggestions?
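For concreteness, here is a minimal sketch (plain Python, no external libraries) computing a few common candidates side by side: the maximum absolute error ($L^\infty$), the mean absolute error, and the root-mean-square error. The arrays `f_exact` and `f_approx` are made-up placeholder data standing in for my two 128-point functions.

```python
import math

# Placeholder data: a sine wave as the "exact" function and a copy with a
# fake +/-0.01 alternating hardware error as the "approximation".
f_exact = [math.sin(2 * math.pi * i / 128) for i in range(128)]
f_approx = [v + 0.01 * ((-1) ** i) for i, v in enumerate(f_exact)]

diff = [a - e for a, e in zip(f_approx, f_exact)]

max_abs = max(abs(d) for d in diff)                     # L-infinity (worst case)
mae = sum(abs(d) for d in diff) / len(diff)             # mean absolute error
rmse = math.sqrt(sum(d * d for d in diff) / len(diff))  # root-mean-square error

print(max_abs, mae, rmse)
```

The worst-case number is what matters if a single bad sample is unacceptable, while RMSE summarizes the typical deviation and is less dominated by one outlier, so reporting both is common.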