Contradiction between Taylor's theorem and properties of the Taylor series of $f(x)=e^{-1/x^2}$?


It seems to me that there is a contradiction between Taylor's theorem and the properties of the Taylor series of the function defined by $f(x)=e^{-1/x^2}$ for $x\ne0$ and $f(0)=0$.

On the one hand, with the above extension $f$ is $C^\infty$ at $x=0$, and the Taylor series $TS[f]_0(x)$ of $f$ at $x=0$ is identically zero with radius of convergence $R_c=\infty$, whereas $f(x)\ne 0$ for $x \ne 0$. As a consequence, $TS[f]_0(x)\ne f(x)$ for all $x\ne0$.
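One can check symbolically that every derivative of $f$ tends to $0$ at the origin; a quick sketch using sympy (assuming it is available), taking the limit of each derivative as $x \to 0$:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(-1 / x**2)  # the x != 0 branch of f

# Each derivative of f tends to 0 as x -> 0, so the extension with
# f(0) = 0 is C^infinity at 0 and all Taylor coefficients vanish.
for n in range(5):
    deriv = sp.diff(f, x, n)
    print(n, sp.limit(deriv, x, 0))  # each limit is 0
```

(Here `sp.limit` takes the one-sided limit by default; since $f$ is even, the two-sided limit agrees.)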

On the other hand, Taylor's theorem states that an expansion to order $n$ gives: $f(x)=f(0)+\frac{f'(0)}{1!}x+\frac{f''(0)}{2!}x^2+\cdots+\frac{f^{(n)}(0)}{n!}x^n+h_n(x)x^n, \qquad$ with $\lim_{x\rightarrow0}h_n(x)=0$.

So this means that we should get better approximations of $f(x)$ as $n$ increases. But that is not the case in this particular example, since increasing $n$ makes no difference to the polynomial expansion.
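The situation can be made concrete numerically; a small sketch in plain Python (function names are illustrative only):

```python
import math

def f(x):
    # the smooth extension: f(0) = 0, otherwise exp(-1/x^2)
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# All Taylor coefficients of f at 0 vanish, so every Taylor
# polynomial P_n is identically 0 and the approximation error
# |f(x) - P_n(x)| = |f(x)| does not shrink as n grows.
x0 = 0.5
for n in (1, 5, 50):
    P_n = 0.0                 # n-th Taylor polynomial of f at 0
    error = abs(f(x0) - P_n)
    print(n, error)           # same error exp(-4) ≈ 0.0183 for every n
```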

Now, what is the problem here? Is it that we cannot apply Taylor's theorem in this particular case, and if so, why? Or is it wrong to express the remainder as $h_n(x)x^n$ with $\lim_{x\rightarrow0}h_n(x)=0$?

Alternatively, what special properties could the functions $h_n(x)$ have that prevent $h_n(x)x^n$ from becoming smaller as $n$ increases?

Or am I missing something else?

3 Answers

BEST ANSWER

You seem to be confused about what the theorem you cite says.

Take $h_n(x) = f(x)\,x^{-n} = \begin{cases} 0 &\text{if } x=0 \\ \frac{e^{-1/x^2}}{x^n} &\text{otherwise.} \end{cases}$

It does satisfy $\lim_{x\to 0} h_n(x) = 0$, and all the properties promised by the theorem. There is no contradiction.


But it is utterly useless for approximating $f$ ever more finely by a sequence of polynomials forming the partial sums of a power series "as $n$ grows." The whole point is exactly that: $f$ cannot be well approximated by such a family of polynomials around $0$. No matter what degree $n$ you choose, you get the zero polynomial, and the remainder/error is... well, $f$ itself.
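A quick numerical check in plain Python that each $h_n$ really does tend to $0$ as $x \to 0$, even for large $n$ (the helper name `h` is illustrative):

```python
import math

def h(n, x):
    # h_n(x) = exp(-1/x^2) / x^n for x != 0, and h_n(0) = 0
    return 0.0 if x == 0 else math.exp(-1.0 / x**2) / x**n

# exp(-1/x^2) decays faster than any power of x as x -> 0, so
# h_n(x) -> 0 even when dividing by a high power x^n.
n = 10
for x in (0.5, 0.2, 0.1, 0.05):
    print(x, h(n, x))  # values plunge toward 0 as x shrinks
```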


Taylor's theorem is perfectly true, even in this case. Simply, the entire value of the function sits in the remainder term. This is one case where the Taylor series does not converge to the value of the function.

$\mathcal C^\infty$ functions whose Taylor series converges to the value of the function are called analytic functions.

Thus you see that analytic functions and $\mathcal C^\infty$ functions are different notions for functions of a real variable.

For functions of a complex variable, the situation is quite different: a function which is differentiable is ipso facto $\mathcal C^\infty$ and analytic.


I would agree that Taylor's theorem gives a weak indication that the approximations should be improving. In particular, we can rephrase the theorem as follows:

Let $f_n$ denote the $n$th-order approximation of $f$. Then the error $e_n(x) = f(x) - f_n(x)$ satisfies $$ \lim_{x \to 0} \frac{e_n(x)}{x^n} = 0 $$ which is to say that $e_n(x) \to 0$ faster than $x^n$ (i.e. $e_n(x) = o(x^n)$).

In a more "typical" Taylor approximation, this pattern would indicate an improvement of $e_n$ as $n \to \infty$, which is to say that $e_n$ approaches zero progressively more quickly and that $e_n(x) \to 0$ as $n \to \infty$. We could see this with the Taylor series for $f(x) = e^x$, for example.
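For contrast, a quick check in plain Python (the helper `taylor_exp` is a name I've made up) that for $e^x$ the Taylor errors really do shrink as $n$ grows:

```python
import math

def taylor_exp(x, n):
    # n-th order Taylor polynomial of e^x at 0: sum of x^k / k!
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# For an analytic function like e^x, the error e_n(x) shrinks
# rapidly as the order n increases.
x0 = 1.0
for n in (2, 5, 10):
    error = abs(math.exp(x0) - taylor_exp(x0, n))
    print(n, error)  # errors decrease toward 0
```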

However, note that the theorem does not actually imply that this is the case. For our problem, we will always have $$ e_n(x) = f(x) $$ and it just so happens that $f(x) \to 0$ faster than $x^n$ for any $n$.