Taylor's Theorem: equality between the function and the infinite series


My understanding of Taylor's theorem is:

Let $f:\mathbf{R} \to \mathbf{R}$ be $k+1$ times differentiable on the open interval between $a$ and $x$, with $f^{(k)}$ continuous on the closed interval between $a$ and $x$. Then $f(x) = P_k(x) + R_k(x)$,

where

$P_k(x) = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \frac{f'''(a)}{3!}(x-a)^3 + \cdots + \frac{f^{(k)}(a)}{k!}(x-a)^k$

and

$R_k(x) = \frac{f^{(k+1)}(c)}{(k+1)!}(x-a)^{k+1}$ where $c$ is between $a$ and $x$.

At least this is one form of the theorem.

So if we apply this to something like $f(x)=e^x$ at $a=0,$ we get

$e^x = 1 + x + \frac{x^2}{2} + R_2(x)$

or

$e^x = 1 + x + \frac{x^2}{2} + \frac{x^3}{3!} + R_3(x)$

etc...

But I often see the equality of $e^x$ with the infinite series $1+ x + \frac{x^2}{2!} + \cdots$.

How is something like this achieved? Is it a consequence of Taylor's theorem? I suppose we need something like $\lim_{k \to \infty} R_k(x) = 0$ for all $x$?



Taylor series are a special case of the more general notion of power series, and you are correct: a Taylor series is obtained by expansion about a point, and in general it converges only in a neighborhood of the expansion point, but for some functions it converges for all $x$ (e.g. $e^x$, $\sin x$, $\cos x$, ...).
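For $e^x$ specifically, the condition $\lim_{k\to\infty} R_k(x) = 0$ from the question can be checked directly; a sketch of the standard estimate, using the Lagrange form of the remainder with $c$ between $0$ and $x$:

$$|R_k(x)| = \frac{e^{c}}{(k+1)!}\,|x|^{k+1} \le \frac{e^{|x|}\,|x|^{k+1}}{(k+1)!} \to 0 \quad\text{as } k\to\infty,$$

since for fixed $x$ the factorial $(k+1)!$ eventually dominates the geometric growth of $|x|^{k+1}$. Hence the partial sums converge to $e^x$ for every $x$.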


Hint:

Any infinite series $a+b+c+d+e+\cdots$ is just shorthand for the sequence of partial sums $$a,\quad a+b,\quad a+b+c,\quad a+b+c+d,\quad a+b+c+d+e,\quad\ldots$$ So whenever you see a series, what you should in fact think of is its sequence of partial sums. The series converges or diverges according as this sequence converges or diverges, and the limit of the sequence is called the sum of the series (an extension of the usual meaning of the word).
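To see the partial sums in action, here is a small numerical sketch (my own illustration, not from either answer) computing $P_k(x)$ for $e^x$ at $a=0$ and comparing it with `math.exp`:

```python
import math

def taylor_partial_sum(x, k):
    """Partial sum P_k(x) of the Taylor series of e^x at a = 0."""
    return sum(x**n / math.factorial(n) for n in range(k + 1))

x = 2.0
for k in (2, 5, 10, 20):
    # Each partial sum is one term of the sequence the series stands for;
    # the gap to exp(x) is the remainder R_k(x).
    print(f"k={k:2d}  P_k({x}) = {taylor_partial_sum(x, k):.10f}  "
          f"exp({x}) = {math.exp(x):.10f}")
```

By $k=20$ the partial sum agrees with $e^2$ to well beyond ten decimal places, matching the factorial decay of the remainder.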