Taylor's Theorem for $C^{\infty}$ functions


I am reading Tu's Introduction to Manifolds, the part where he derives Taylor's theorem with remainder for $C^{\infty}$ functions on a set $U$ that is star-shaped with respect to the point of expansion. The result is used to derive the following expansion around $x=0$ for a function $f:\mathbb{R} \rightarrow \mathbb{R}$:

\begin{equation} f(x) = f(0) + g_1(0)x + g_2(0)x^2 + \cdots + g_i(0)x^i + g_{i+1}(x)x^{i+1} \end{equation} where \begin{equation} g_k(0) = \frac{1}{k!} f^{(k)}(0), \qquad k = 1,2,\ldots,i, \end{equation} which is exactly the degree-$i$ Taylor polynomial of $f$ around $0$, plus a remainder term.
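As a quick numerical sanity check of this expansion (my own sketch in Python, not from Tu; the helper name `taylor_partial_sum` and the choice $f = \exp$ are mine), one can verify that dividing the remainder by $x^{i+1}$ yields a bounded quantity $g_{i+1}(x)$, as the theorem asserts:

```python
import math

def taylor_partial_sum(x, i):
    # Partial sum f(0) + g_1(0)x + ... + g_i(0)x^i for f = exp,
    # where g_k(0) = f^(k)(0)/k! = 1/k!  (exp is its own derivative).
    return sum(x**k / math.factorial(k) for k in range(i + 1))

x = 0.5
for i in (1, 3, 6):
    remainder = math.exp(x) - taylor_partial_sum(x, i)
    # remainder = g_{i+1}(x) * x^{i+1}; dividing by x^{i+1} recovers
    # g_{i+1}(x), which stays bounded -- the point of Tu's smooth remainder.
    g_next = remainder / x**(i + 1)
    print(i, remainder, g_next)
```

For $f = \exp$ the remainder shrinks rapidly as $i$ grows, which is exactly the analytic case; the question below is about a function where it does not.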

It was shown in a previous section that the function $g:\mathbb{R} \rightarrow \mathbb{R}$ defined by

\begin{equation} g(x) = \begin{cases} e^{-1/x}, & x>0 \\ 0, & x\leq 0 \end{cases} \end{equation}

is not equal to its Taylor series at $x=0$ (every derivative of $g$ vanishes at $0$, so the series is identically zero), and hence $g$ is not real analytic even though it is $C^{\infty}$.
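To see numerically why every Taylor coefficient of $g$ at $0$ vanishes (my own sketch in Python; the sample points are arbitrary), note that $e^{-1/x}$ decays faster than any power of $x$ as $x \to 0^+$, so $g(x)/x^n \to 0$ for every $n$:

```python
import math

def g(x):
    # The smooth-but-not-analytic function from the question.
    return math.exp(-1.0 / x) if x > 0 else 0.0

# g(x)/x^n -> 0 as x -> 0+ for every n, which is why every
# derivative of g at 0 (hence every Taylor coefficient) vanishes.
for n in (1, 5, 10):
    x = 0.01
    print(n, g(x) / x**n)   # tiny, despite dividing by a tiny power of x

# Yet g itself is strictly positive for x > 0:
print(g(0.5))  # e^{-2} ≈ 0.1353
```

So the Taylor series of $g$ at $0$ converges everywhere, just to the zero function rather than to $g$.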

The entire set $\Bbb{R}$ is star-shaped with respect to $x=0$, since for every point the segment joining it to $0$ lies in $\Bbb{R}$. So if the expansion derived above is valid, it must hold for $g(x)$ at every point $x$, yet we have seen explicitly that $g$ does not equal its Taylor series at $0$. I don't know what I am missing.

1 Answer

There is no contradiction: the theorem is perfectly valid for $g(x)$ too, and for each fixed $i$ it gives an exact expansion with a $C^{\infty}$ remainder. But in order to obtain a Taylor *series* for $g(x)$ that converges to $g$ on some interval $[0,x_0]$, the remainder $g_{i+1}(x)x^{i+1}$ (with $0<x\le x_0$) would have to tend to zero as $i\to+\infty$, which in this case does not happen.
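This failure is easy to witness concretely (my own sketch in Python, reusing the definition of $g$ from the question): since every Taylor coefficient of $g$ at $0$ is zero, every partial sum is zero, so the remainder $g_{i+1}(x)x^{i+1} = g(x) - 0$ is the same nonzero constant for every $i$ and never tends to zero:

```python
import math

def g(x):
    return math.exp(-1.0 / x) if x > 0 else 0.0

x0 = 0.5
# Every derivative of g at 0 vanishes, so each Taylor partial sum at 0
# is identically zero and the remainder equals g(x0) itself for all i:
for i in range(5):
    partial_sum = 0.0                # sum of i+1 zero coefficients
    remainder = g(x0) - partial_sum
    print(i, remainder)              # constant e^{-2} ≈ 0.1353 for every i
```

Contrast this with the analytic case (e.g. $f = \exp$), where the same remainder shrinks to zero as $i$ grows.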