How can I characterise the error of an interpolated surface?


I am writing a program that interpolates and displays a surface using kernel interpolation. Let's say I interpolate a function $f(x)$ by the function $f^*(x)$. Clearly the error at any given point is just $|f(x) - f^*(x)|$, but how can I characterise the overall or average error over some range?

3 Answers

Answer 1 (0 votes):

You can average the error over a reasonably sized sample of points in that region; that average is a good single-number characterisation.
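A minimal sketch of this sampling approach, assuming hypothetical stand-ins for $f$ and $f^*$ (here a sine function and a slightly perturbed copy of it):

```python
import numpy as np

# Hypothetical true function and interpolant stand-in; in practice
# f_star would be your kernel interpolant evaluated at x.
def f(x):
    return np.sin(x)

def f_star(x):
    return np.sin(x) + 0.01 * np.cos(5 * x)  # pretend interpolant

rng = np.random.default_rng(0)
a, b = 0.0, 2 * np.pi            # the range of interest
xs = rng.uniform(a, b, 10_000)   # reasonably sized random sample

# Mean absolute error over the sample.
mean_abs_error = np.mean(np.abs(f(xs) - f_star(xs)))
```

Increasing the sample size makes this estimate converge to the true mean error over the region.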

Answer 2 (0 votes):

If I understand you correctly, you also wish to take into account the values in between interpolation points. If not, simply sum up the errors at each interpolation point (and normalize by the number of points if you want an average).

One way to compute the error $E$ is to integrate the difference over your domain $\Omega$, i.e.

$$E_1 = \int_\Omega |f(x)-f^*(x)|\,\mathrm{d}x.$$

This is the $L^1$-error. In general the $L^p$-error is $$E_p = \left(\int_\Omega |f(x)-f^*(x)|^p\,\mathrm{d}x\right)^{1/p}.$$

Sometimes the maximum error is also used, i.e. the supremum norm. In my experience the most commonly used values of $p$ are $1$, $2$ and $\infty$ (the supremum norm).

Note that you may have to estimate these integrals numerically. There are many ways to do this, but this should be answered in a separate question if necessary.
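As one illustration of such a numerical estimate, the $L^1$, $L^2$, and sup-norm errors can be approximated on a fine grid with the trapezoidal rule, again using hypothetical stand-ins for $f$ and $f^*$:

```python
import numpy as np

# Hypothetical true function and interpolant stand-in.
def f(x):
    return np.sin(x)

def f_star(x):
    return np.sin(x) + 0.01 * np.cos(5 * x)

a, b = 0.0, 2 * np.pi
xs = np.linspace(a, b, 100_001)        # fine grid over the domain
diff = np.abs(f(xs) - f_star(xs))
dx = np.diff(xs)

# Trapezoidal rule, written out explicitly for self-containment.
def trapz(y, dx):
    return np.sum((y[:-1] + y[1:]) / 2 * dx)

E1 = trapz(diff, dx)                   # L^1 error
E2 = trapz(diff**2, dx) ** 0.5         # L^2 error
Einf = diff.max()                      # sup-norm estimate
```

For this example the exact values are $E_1 = 0.04$, $E_2 = 0.01\sqrt{\pi} \approx 0.0177$, and $E_\infty = 0.01$, so the grid estimates can be checked directly.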

Answer 3 (1 vote):

Given that you are using a finite number of interpolation points, you can use sums rather than the integrals that Eff gave.
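The discrete analogues of those norms are simple sums over the pointwise errors; a sketch with hypothetical error values (the same toy pair of functions as above, evaluated at 50 points):

```python
import numpy as np

# Pointwise errors |f(x_i) - f*(x_i)| at a finite set of points
# (hypothetical toy values for illustration).
x = np.linspace(0.0, 2 * np.pi, 50)
err = np.abs(np.sin(x) - (np.sin(x) + 0.01 * np.cos(5 * x)))

n = len(x)
E1 = err.sum() / n                  # discrete L^1 (mean absolute error)
E2 = (np.sum(err**2) / n) ** 0.5    # discrete L^2 (RMS error)
Einf = err.max()                    # discrete sup-norm (max error)
```

These are the standard mean-absolute, root-mean-square, and maximum error summaries computed from the same data.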