How does the Taylor Series converge at all points for certain functions


The way my professor defined Taylor polynomials is: the $n^{th}$-degree Taylor polynomial $p(x)$ of $f(x)$ is a polynomial that satisfies $\lim_{x\to 0}{f(x)-p(x) \over x^n} = 0$. This is the little-o condition $f(x)-p(x)=o(x^n)$, meaning $f(x)-p(x) \ll x^n$ as $x$ approaches $0$. From this I got the intuition that Taylor polynomials work only for $|x| < 1$, because $x^n$ gets smaller as $n$ gets bigger only when $|x| < 1$. The textbook seemed to agree with my intuition, since it says “Taylor polynomial near the origin” (probably implying $|x| < 1$).

Since the Taylor series is essentially the Taylor polynomial with $n\to\infty$, I intuitively thought that the Taylor series would also converge to the function it represents only on the interval $(-1, 1)$.

For example, it is well known that the Taylor series of $\frac{1}{1-x}$ converges only for $|x| < 1$.
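This behavior is easy to see numerically. Here is a small illustrative sketch (the sample points and term counts are my own, not from the textbook) comparing partial sums of the geometric series $1 + x + x^2 + \cdots$ against $\frac{1}{1-x}$:

```python
# Illustrative check: partial sums of the Taylor (geometric) series of 1/(1-x).

def geometric_partial_sum(x, n):
    """Partial sum 1 + x + x^2 + ... + x^n of the Taylor series of 1/(1-x)."""
    return sum(x**k for k in range(n + 1))

# Inside the interval of convergence the partial sums approach 1/(1-x)...
x = 0.5
approx = geometric_partial_sum(x, 50)
exact = 1 / (1 - x)
print(abs(approx - exact))  # essentially zero

# ...but outside |x| < 1 the terms x^k grow, so the partial sums blow up
# instead of approaching 1/(1-2) = -1.
x = 2.0
print(geometric_partial_sum(x, 10))  # 2^11 - 1 = 2047
```

For this particular function, the intuition that things break outside $|x| < 1$ really is correct; the question is why it fails to generalize.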

However, all of a sudden, the textbook says that the Taylor series of $\cos x$ converges for all real $x$. This confused me because I previously thought the Taylor series would only work for $|x|<1$. Now, I know that the Taylor series converges to the function exactly when the remainder vanishes: $$ f(x) = Tf(x) \Leftrightarrow \lim_{n\to\infty}R_{n}f(x) = 0 $$

And I know how to bound the Taylor remainder for $\cos x$ using Taylor's Theorem, and I know that this remainder tends to $0$ for all real $x$, which makes the Taylor series of $\cos x$ converge to $\cos x$ pointwise. However, I just can't see why my initial intuition is wrong (why the Taylor series converges for all $x$ for certain functions like $\cos x$, and also $\sin x$, $e^x$, etc.).
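For concreteness, here is a sketch of the convergence the post describes, assuming the standard Lagrange bound $|R_n\cos(x)| \le \frac{|x|^{n+1}}{(n+1)!}$ (every derivative of $\cos$ is bounded by $1$, and factorials eventually dominate powers, so the bound goes to $0$ for every fixed real $x$, even $|x| > 1$):

```python
import math

def cos_taylor(x, n_terms):
    """Partial sum of the Maclaurin series of cos: sum of (-1)^k x^(2k)/(2k)!."""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(n_terms))

# x = 10 is well outside (-1, 1), yet the partial sums still converge to
# cos(10): the error shrinks rapidly once the factorials take over.
x = 10.0
for n_terms in (5, 15, 30):
    print(n_terms, abs(cos_taylor(x, n_terms) - math.cos(x)))
```

The early partial sums are wildly wrong for large $|x|$, but eventually $(2k)!$ grows much faster than $x^{2k}$, which is the mechanism the remainder bound captures.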

On BEST ANSWER

Actually, things may go wrong in $(-1,1)$. For instance, the Taylor series centered at $0$ of $f(x)=\frac1{1-nx}$ only converges to $f(x)$ on $\left(-\frac1n,\frac1n\right)$. And if $$f(x)=\begin{cases}e^{-1/x^2}&\text{ if }x\ne0\\0&\text{ if }x=0,\end{cases}$$ then the Taylor series of $f$ only converges to $f(x)$ if $x=0$.

On the other hand, yes, Taylor series centered at $0$ are made to converge to $f(x)$ near $0$. But that's no reason to expect that they don't converge to $f(x)$ when $x$ is far from $0$. That would be like expecting a non-constant power series $a_0+a_1x+a_2x^2+\cdots$ to take larger and larger values as the distance from $x$ to $0$ grows. That happens often, but $1-\frac1{2!}x^2+\frac1{4!}x^4-\cdots=\cos(x)$, which is bounded.


The first problem was wrongly concluding that because the series of polynomials is developed near the origin, it can only be valid near the origin. There is no prior reason to suppose this. Yes, the polynomials approximate the function arbitrarily closely as you approach the origin, but this does not mean they cannot also approximate it well at points far from the origin.

In other words, you have gone from $a\implies b$ to $\lnot a\implies \lnot b$, which is not a valid inference in general: the second statement need not hold just because the first does.

Since you already know why the series for entire functions like $\cos x$ converges everywhere (as you explain towards the end of your post), you should now see where your original intuition (I would say erroneous belief) misled you.

Hope this helped!