When is it invalid to use Taylor series expansion?


I was told by a fellow student that sometimes one cannot represent certain functions by a Taylor series. I was also told that sometimes using a Taylor series in a proof is invalid. Is any of this true? When is it invalid to use Taylor series expansion?

Edit: By certain functions, I mean well-behaved functions with nice properties: entire, continuous, etc.


BEST ANSWER

Take a look at $$f(x)=\begin{cases}e^{-1/x^2}&x\ne0\\0&x=0\end{cases}$$ and take its Taylor expansion around $x=0$:

$$f(x)=f(0)+f'(0)x+\frac{f''(0)}2x^2+\dots$$

Now, you will find out that

$$f(0)=f'(0)=f''(0)=\dots=0$$

So, by equating $f$ with its Taylor series here, one has

$$e^{-1/x^2}=0$$

which is nonsense.
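A quick numerical sketch of this in plain Python (the function name is illustrative): every Taylor polynomial of $f$ at $0$ is the zero polynomial, yet $f(x)>0$ for every $x\ne0$.

```python
import math

def f(x):
    """f(x) = exp(-1/x^2) for x != 0, and f(0) = 0."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# All derivatives of f at 0 vanish, so every Taylor polynomial
# P_N at 0 is identically zero -- yet f is positive away from 0.
for x in (0.1, 0.5, 1.0):
    print(x, f(x), 0.0)  # f(x) > 0, while P_N(x) = 0 for every N
```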


Then, there is a second case. As Ethan Alwaise mentions, any series expansion makes no sense if it doesn't converge. Take, for example, the expansion of $\frac1{1-r}$ at $r=0$. Then consider that expansion for $r=2$. You should get something along the following lines:

$$\frac1{1-2}=1+2+4+8+\dots+2^n+\dots$$

which is also nonsense.
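In plain Python, the partial sums at $r=2$ run away from the value $\frac1{1-r}=-1$:

```python
# Partial sums of 1 + r + r^2 + ... at r = 2, versus 1/(1 - r) = -1
r = 2
partial_sums = []
s = 0
for n in range(11):
    s += r**n
    partial_sums.append(s)
print(1 / (1 - r))   # -1.0
print(partial_sums)  # 1, 3, 7, 15, ... growing without bound
```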


He is right: many functions don't have a Taylor series. For example, $f(x)=\begin{cases}1 \text{ for } x\in \Bbb{Q}\\0\text{ otherwise}\end{cases}$ is discontinuous everywhere, hence nowhere differentiable, and thus has no Taylor series.


To add to Zachary's answer, a function may have a Taylor series that is valid only within a restricted domain. For example, the function $f(x) = 1/(1 - x)$ is defined on $\mathbb{R} \setminus \{1\}$. It has a Taylor series $$f(x) = \sum_{n=0}^{\infty}x^n$$ on the interval $x \in (-1,1)$, but for $\vert x \vert \geq 1$, the above series diverges.
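A small check in plain Python: inside $(-1,1)$ the partial sums approach $1/(1-x)$, while for $\vert x \vert \geq 1$ they never settle down (the helper name is just for illustration).

```python
def geom_partial(x, N):
    """Partial sum sum_{n=0}^{N} x^n of the Taylor series of 1/(1-x)."""
    return sum(x**n for n in range(N + 1))

inside = 0.5    # |x| < 1: partial sums converge to 1/(1-x) = 2
outside = 1.5   # |x| >= 1: partial sums grow without bound
print(geom_partial(inside, 50), 1 / (1 - inside))
print(geom_partial(outside, 10), geom_partial(outside, 50))
```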


If by "represents a function by its Taylor series" you mean that the function equals its Taylor series, then your friend is right.

Indeed, the Taylor series at $0$ of $f(x)=\left\{\begin{array}{ll}e^{-1/x}&\textrm{if }x>0\\0&\textrm{otherwise}\end{array}\right.$ is identically zero, but $f$ is nonzero. However, $f$ is $\mathcal{C}^{\infty}$.

The functions you are looking for are the analytic functions: those that are locally equal to their Taylor series.


Let $N$ be a positive integer and let $f$ be a $C^{N+1}$ function defined on an interval $[a,b]$ which contains zero. By the fundamental theorem of calculus,

$$ f(x) = f(0) + \int_0^x f'(t)\, dt. $$

Integrating by parts (choose $u = f'(t)$, $v = (t-x)$) we obtain

$$ \begin{align} \int_0^x f'(t)\,dt &= \left. (t-x) f'(t) \right|_0^x - \int_0^x (t-x) f''(t)\, dt \\ &= x f'(0) - \int_0^x (t-x) f''(t)\, dt. \end{align} $$

Therefore

$$ f(x) = f(0) + xf'(0) - \int_0^x (t-x) f''(t)\, dt. $$

We can continue inductively integrating the last term by parts to obtain

$$ f(x) = \sum_{n=0}^N f^{(n)}(0) \frac{x^n}{n!} + R_N(x), $$

where the remainder term $R_N(x)$ is defined by

$$ R_N(x) = \frac{(-1)^N}{N!} \int_0^x (t-x)^N f^{(N+1)}(t)\, dt. $$

This gives an exact expression for the error in the $N$th-order Taylor approximation.
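The exact remainder formula can be checked numerically. A sketch in plain Python, taking $f=\exp$ (so $f^{(N+1)}=\exp$) and approximating the integral with a simple trapezoidal rule (the helper name and step count are just for illustration):

```python
import math

def remainder(f_deriv_Np1, x, N, steps=10_000):
    """R_N(x) = (-1)^N / N! * integral_0^x (t - x)^N f^(N+1)(t) dt,
    approximated by the trapezoidal rule."""
    h = x / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        w = 0.5 if i in (0, steps) else 1.0
        total += w * (t - x)**N * f_deriv_Np1(t)
    return (-1)**N / math.factorial(N) * total * h

N, x = 4, 1.0
taylor = sum(x**n / math.factorial(n) for n in range(N + 1))
true_error = math.exp(x) - taylor
print(true_error, remainder(math.exp, x, N))  # the two agree closely
```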

For a counterexample such as $f(x) = e^{-1/x}$ for $x > 0$ and $0$ otherwise, every derivative of $f$ vanishes at $0$, so $R_N(x) = f(x) > 0$ for every $N$ and every $x > 0$. The remainder never tends to zero as $N \to \infty$, and therefore the Taylor series does not converge to $f(x)$, even though the series itself converges (to $0$).