It seems to me that there is a contradiction between Taylor's theorem and the properties of the Taylor series of the function $f(x)=e^{-1/x^2}$ for $x\ne0$, with $f(0)=0$.
On the one hand, with this extension $f$ is $C^\infty$ at $x=0$ and the Taylor series $TS[f]_0(x)$ of $f$ at $x=0$ is identically zero with radius of convergence $R_c=\infty$, whereas $f(x)\ne 0$ for $x \ne 0$. Consequently, $TS[f]_0(x)\ne f(x)$ for all $x\ne0$.
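(For completeness, the vanishing of the coefficients can be checked directly; for the first derivative, substituting $t = 1/x$ gives

$$f'(0) = \lim_{x\to 0}\frac{f(x)-f(0)}{x} = \lim_{x\to 0}\frac{e^{-1/x^2}}{x} = \lim_{|t|\to\infty} t\,e^{-t^2} = 0,$$

and an induction shows each $f^{(n)}(0)=0$ the same way, since every derivative of $f$ on $x\ne0$ has the form $p(1/x)\,e^{-1/x^2}$ for some polynomial $p$.)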
On the other hand, Taylor's theorem states that an expansion to order $n$ gives: $f(x)=f(0)+\frac{f'(0)}{1!}x+\frac{f''(0)}{2!}x^2+\cdots+\frac{f^{(n)}(0)}{n!}x^n+h_n(x)x^n, \qquad$ with $\lim_{x\rightarrow0}h_n(x)=0$.
So this suggests that we should get better approximations of $f(x)$ as $n$ increases, which is not the case in this particular example: increasing $n$ makes no difference to the polynomial expansion, since every coefficient is zero.
Now, what is the problem here? Is it that we cannot apply Taylor's theorem in this particular case, and if so, why? Or is it wrong to express the remainder as $h_n(x)x^n$ with $\lim_{x\rightarrow0}h_n(x)=0$?
Alternatively, what special properties could the functions $h_n(x)$ have that prevent $h_n(x)x^n$ from becoming smaller as $n$ increases?
Or am I missing something else?
You seem to be confused about what the theorem you cite says.
Take $h_n(x) = f(x)\,x^{-n} = \begin{cases} 0 &\text{if } x=0, \\ \frac{e^{-1/x^2}}{x^n} &\text{otherwise.} \end{cases}$
It does satisfy $\lim_{x\to 0} h_n(x) = 0$, and all the properties promised by the theorem. There is no contradiction.
But it is utterly useless for approximating $f$ more and more finely by a sequence of polynomials forming the partial sums of a power series "as $n$ grows." The whole point is exactly that: $f$ cannot be well approximated by such a family of polynomials around $0$. No matter what degree $n$ you choose, you get the zero polynomial, and the remainder/error is... well, $f$ itself.
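As a quick numerical illustration of the two facts side by side (a standalone sketch; the names `f` and `h` are just the question's $f$ and $h_n$): for any *fixed* $n$, $h_n(x)\to 0$ as $x\to 0$, yet for any *fixed* $x\ne 0$ the remainder $h_n(x)\,x^n$ equals $f(x)$ for every $n$, so raising $n$ buys nothing.

```python
import math

def f(x):
    # the flat function: f(0) = 0, f(x) = exp(-1/x^2) otherwise
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

def h(n, x):
    # h_n(x) = f(x) / x^n, the "little-o" factor from Taylor's theorem
    return 0.0 if x == 0 else f(x) / x**n

# Fixed n: h_n(x) -> 0 as x -> 0, as the theorem promises.
for x in [0.5, 0.2, 0.1, 0.05]:
    print(f"h_10({x}) = {h(10, x):.3e}")

# Fixed x: the Taylor polynomial is identically 0, so the
# remainder h_n(x) * x^n is all of f(x), for every n.
x = 0.5
for n in [1, 5, 20]:
    print(f"n={n}: remainder = {h(n, x) * x**n:.6e}, f(x) = {f(x):.6e}")
```

The output makes the asymmetry visible: the first loop shrinks rapidly toward $0$, while in the second loop the remainder never budges from $f(0.5)=e^{-4}$.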