Confusion about two claims concerning interpolating polynomials


Here

https://de.wikipedia.org/wiki/Polynominterpolation

under "Konvergenzverhalten" ("convergence behavior") I found the following claim:

If $f$ is analytic on $[a,b]$ and we have a sequence of node sets at which $f$ is interpolated such that the maximal mesh width (the largest gap between adjacent nodes) tends to $0$, then the sequence of interpolating polynomials converges uniformly to $f$.

Here

https://en.wikipedia.org/wiki/Runge%27s_phenomenon

the function $$f(x)=\frac{1}{1+25x^2}$$ is given as an example for which the interpolation error grows without bound as more and more equidistant nodes in the interval $[-1,1]$ are used to interpolate $f$.
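This divergence is easy to observe numerically. Below is a minimal sketch (not from either Wikipedia article) that interpolates $f$ at $n$ equidistant nodes and measures the maximum error on $[-1,1]$; it uses the barycentric form of the interpolating polynomial, since naive Vandermonde fitting is too ill-conditioned at high degree to trust the result.

```python
import numpy as np

def runge(x):
    return 1.0 / (1.0 + 25.0 * x ** 2)

def barycentric_interp(nodes, values, xs):
    """Evaluate the polynomial interpolating (nodes, values) at the points xs,
    using the numerically stable barycentric formula."""
    # barycentric weights w_j = 1 / prod_{k != j} (x_j - x_k)
    diff = nodes[:, None] - nodes[None, :]
    np.fill_diagonal(diff, 1.0)
    w = 1.0 / diff.prod(axis=1)
    # distances from each evaluation point to each node
    d = xs[None, :] - nodes[:, None]
    hit = np.isclose(d, 0.0)            # evaluation points that land on a node
    d = np.where(hit, 1.0, d)           # avoid division by zero; patched below
    num = (w[:, None] * values[:, None] / d).sum(axis=0)
    den = (w[:, None] / d).sum(axis=0)
    y = num / den
    # where xs coincides with a node, the interpolant equals the node value
    col_hit = hit.any(axis=0)
    y[col_hit] = values[hit.argmax(axis=0)[col_hit]]
    return y

def max_error(n):
    """Max |f - p| on [-1,1] when interpolating f at n equidistant nodes."""
    nodes = np.linspace(-1.0, 1.0, n)
    xs = np.linspace(-1.0, 1.0, 2001)
    return np.abs(barycentric_interp(nodes, runge(nodes), xs) - runge(xs)).max()

for n in (5, 11, 21, 41):
    print(f"n = {n:2d}   max error = {max_error(n):.3g}")
```

Running this shows the maximum error growing by orders of magnitude as $n$ increases, with the largest errors concentrated near the interval endpoints.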

Is this not a contradiction to the claim above, since $f$ is analytic on $[-1,1]$ and the mesh widths of the node sets converge to $0$?

What am I missing?

There is 1 answer below.

On BEST ANSWER

The claim is correct for polynomial interpolation on an interval $[a,b]$ with any choice of nodes, provided the function is analytic in a complex ball of radius $r>3(b-a)/2$ centered at $(a+b)/2$. When $r$ is somewhat smaller, convergence depends on the choice of interpolation points; $r>(b-a)/2$ is in general necessary. This hypothesis is not satisfied by the Cauchy-type function $f$ above: it has poles at $\pm i/5$, which lie well inside the corresponding ball, so the theorem does not apply. Real analyticity on $[a,b]$ alone is not enough. (I vaguely recall that one may even estimate the rate at which the interpolation diverges...)