Taylor polynomial - the error doesn't always converge to $0$


I want to calculate the Taylor polynomial of order $n$ for the function $f(x) = \frac{1}{1-x}$ centered at $x_0=0$, for $0 < x < 1$, together with the remainder $R_n$.

I have found that \begin{equation*}P_{0,n}(x)=\sum_{k=0}^n\frac{f^{(k)}(0)\cdot x^k}{k!}=\sum_{k=0}^n x^k \end{equation*}
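This can be checked symbolically; a minimal sketch (assuming `sympy` is available, with $n=5$ as an arbitrary illustrative choice) confirming that every Taylor coefficient of $\frac{1}{1-x}$ at $0$ equals $1$:

```python
# Verify that f^(k)(0)/k! = 1 for 1/(1-x), so P_n(x) = 1 + x + ... + x^n.
import sympy as sp

x = sp.symbols('x')
f = 1 / (1 - x)
n = 5  # arbitrary illustrative order
coeffs = [sp.diff(f, x, k).subs(x, 0) / sp.factorial(k) for k in range(n + 1)]
print(coeffs)  # [1, 1, 1, 1, 1, 1]
```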

Is the remainder \begin{equation*}R_n=\frac{\frac{(n+1)!}{(1-\xi)^{n+2}}}{(n+1)!}\cdot (x-0)^n=\frac{x^n}{(1-\xi)^{n+2}}\end{equation*} ? Or do we have to use the formula with the integral?

In some notes I read that, in general, the remainder doesn't converge to $0$ for all $\xi\in (0,x)$. Can you give me an example of that?


1 Answer


The remainder for Taylor's theorem isn't the one you found.

The Lagrange form of the remainder is $R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-0)^{n+1} = \frac{x^{n+1}}{(1-\xi)^{n+2}}$, since $f^{(n+1)}(t) = \frac{(n+1)!}{(1-t)^{n+2}}$. Note the exponent $n+1$ on $x$, not $n$.
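The derivative formula $f^{(n+1)}(t) = \frac{(n+1)!}{(1-t)^{n+2}}$ that feeds into the Lagrange form can be checked symbolically; a sketch assuming `sympy`, with $n=4$ as an arbitrary pick:

```python
# Check that the (n+1)-th derivative of 1/(1-x) is (n+1)!/(1-x)^(n+2).
import sympy as sp

x = sp.symbols('x')
n = 4  # arbitrary illustrative value
deriv = sp.diff(1 / (1 - x), x, n + 1)
closed_form = sp.factorial(n + 1) / (1 - x)**(n + 2)
difference = sp.simplify(deriv - closed_form)
print(difference)  # 0
```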

For this particular $f$, the remainder is easy to compute exactly as $$f(x)-\sum_{k=0}^n x^k=\frac{1}{1-x}-\sum_{k=0}^n x^k=\frac{x^{n+1}}{1-x}.$$

Therefore $R_n(x)=\frac{x^{n+1}}{1-x}$, which does tend to $0$ as $n\to\infty$ for each fixed $x\in(0,1)$.
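A quick numeric sanity check of this exact remainder (the values $x=0.5$ and $n=10$ are arbitrary illustrative choices):

```python
# Compare f(x) - P_n(x) against the closed form x^(n+1)/(1-x).
x, n = 0.5, 10
partial_sum = sum(x**k for k in range(n + 1))  # P_n(x) = 1 + x + ... + x^n
remainder = 1 / (1 - x) - partial_sum          # f(x) - P_n(x)
closed_form = x**(n + 1) / (1 - x)
print(remainder, closed_form)  # both equal 0.0009765625
```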