Function interpolation / approximation of function given by points only


I am wondering how to estimate the error of an interpolation (Lagrange, Newton, spline) or approximation (trigonometric) of a function whose mathematical expression I don't know. I only have some sample points from the function, which I use to construct the interpolant / approximant.

I was thinking of measuring the error as the difference between:

->the mean of the function's values (y's) at the given points used to construct the interpolant

and

->the mean of the interpolant's values at the calculated points

But this does not seem to give any meaningful result.


There is 1 answer below.

On BEST ANSWER

The usual way of doing this involves assuming a function space, say $C^{k}$, and then proving that the error is bounded by a constant times a power of the spacing between sample points. Unfortunately, that constant is unknown unless an explicit form of the function is known.
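To see what such an $O(h^{k})$ bound looks like in practice, here is a sketch that checks the convergence order of a cubic spline empirically on a *known* test function ($\sin 2\pi x$ is my choice, not from the original post; for cubic splines $k = 4$). Halving the spacing should shrink the maximum error by roughly $2^{4} = 16$:

```python
# Empirical check of O(h^k) accuracy for cubic-spline interpolation.
# Test function and grid sizes are illustrative assumptions.
import numpy as np
from scipy.interpolate import CubicSpline

def max_error(n):
    """Max error of a cubic spline through n+1 samples of sin(2*pi*x) on [0, 1]."""
    x = np.linspace(0.0, 1.0, n + 1)
    spline = CubicSpline(x, np.sin(2 * np.pi * x))
    x_dense = np.linspace(0.0, 1.0, 2001)
    return np.abs(spline(x_dense) - np.sin(2 * np.pi * x_dense)).max()

e_coarse = max_error(20)   # spacing h
e_fine = max_error(40)     # spacing h/2
ratio = e_coarse / e_fine  # expect roughly 2**4 = 16 for a fourth-order method
print(ratio)
```

The point is that the *order* of convergence is observable from the data alone, even though the constant in the error bound is not.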

I would just interpolate half the points, then measure the error of that interpolant at the remaining points. If you truly know nothing about your function, that is as good as you can do. If you can assume a function class, you can often show that a particular interpolant has $O(h^{k})$ accuracy for some $k$. Then, if the spacing between samples is halved, it is safe to assume the error drops to about $1/2^{k}$ of the coarse interpolant's error.
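The hold-out idea above can be sketched as follows (the sample data and the choice of a cubic spline are assumptions for illustration; in practice the y-values come from your measurements):

```python
# Hold-out estimate of interpolation error for a function known only
# through its samples: fit on every other point, test on the rest.
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical samples of the unknown function on [0, 1].
x = np.linspace(0.0, 1.0, 21)
y = np.sin(2 * np.pi * x)          # stands in for the measured y-values

# Split: fit on the even-indexed points, hold out the odd-indexed ones.
x_fit, y_fit = x[::2], y[::2]
x_test, y_test = x[1::2], y[1::2]

spline = CubicSpline(x_fit, y_fit)
errors = np.abs(spline(x_test) - y_test)

print("max hold-out error:", errors.max())
print("mean hold-out error:", errors.mean())
```

The maximum hold-out error is a direct, assumption-free estimate of how badly the interpolant can miss between samples, which is exactly what the mean-of-means comparison in the question cannot capture.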