I'm studying for my scientific computing exam, specifically the topic of interpolation techniques, and I'm stuck on this problem:
How many equally spaced nodes must be taken to interpolate the function $f(x) = \exp(x)$ on the interval $[-1,1]$, so that the interpolation error is at most $0.5 \times 10^{-4}$?
I have no idea how to start or what I should do in this exercise. Any help?
You first need to specify the interpolation technique being used. If you use linear interpolation between the nodes (which I suspect is what you are doing), the error behaves like the remainder of a first-order Taylor series: $\frac{f''(\xi)(x-a)^2}{2!}$ for some $\xi$ in the interval. Here $|x-a|$ is at most half your node spacing, which you can choose to make the error small enough. What is the maximum value of $f''(x)$ on $[-1,1]$?
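As a sanity check of this approach, here is a short numerical sketch. It assumes piecewise-linear interpolation and the standard bound $|f(x) - p(x)| \le \frac{h^2}{8}\max|f''|$ (which follows from the error term above with $|x-a| \le h/2$), then verifies the resulting node count against the actual interpolation error on a fine grid:

```python
import math
import numpy as np

# Bound for piecewise-linear interpolation with node spacing h:
#   |f(x) - p(x)| <= (h^2 / 8) * max|f''|
# For f(x) = exp(x) on [-1, 1], f''(x) = exp(x), so max|f''| = e.
a, b = -1.0, 1.0
tol = 0.5e-4
M = math.e                               # max of |f''| on [-1, 1]

h_max = math.sqrt(8 * tol / M)           # largest admissible spacing
n_sub = math.ceil((b - a) / h_max)       # number of subintervals
n_nodes = n_sub + 1
print(n_nodes)                           # → 166

# Numerical check: measure the actual error of the piecewise-linear
# interpolant (np.interp) against exp on a fine grid.
nodes = np.linspace(a, b, n_nodes)
xs = np.linspace(a, b, 100001)
err = np.max(np.abs(np.exp(xs) - np.interp(xs, nodes, np.exp(nodes))))
print(err <= tol)                        # → True
```

So with this bound you would need 166 equally spaced nodes (165 subintervals), and the measured error indeed stays below the tolerance.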