Previous questions on this site have addressed smooth non-analytic functions, i.e. functions that have derivatives of all orders but do not equal their Taylor series. An example is $f(t)=e^{-1/t}$ for $t>0$ and $f(t)=0$ for $t \leq 0$.
My question is about how this relates to Taylor's Theorem, which says that if a function $f$ has $N+1$ derivatives on a closed interval containing $c$ and $t_0$, then there is a $\zeta$ between $c$ and $t_0$ such that $$f(t_0)= \sum_{k=0}^{N}{\frac{f^{(k)}(c)(t_0- c)^k}{k!}}+\frac{f^{(N+1)}(\zeta)(t_0 - c)^{N+1}}{(N+1)!}.$$ This is a Taylor polynomial plus an error term. It seems like if the error term tends to $0$ as $N$ grows to infinity, then the infinite Taylor series converges to $f(t_0)$. Why does it not follow that if the series converges, it must converge to the correct value of $f(t_0)$? In the above example, the Taylor series centered at $c=0$ exists and converges everywhere, but for $t>0$ it converges to the wrong value of $f(t)$.
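The situation in the question can be illustrated numerically. A minimal sketch (the names `f` and `taylor_partial_sum` are my own, not from the question): since every derivative of $f$ at $c=0$ is $0$, every Taylor coefficient is $0$, so the partial sums converge trivially, but to $0$ rather than to $f(t)$ for $t>0$.

```python
import math

def f(t):
    # Smooth but non-analytic at 0: exp(-1/t) for t > 0, 0 otherwise.
    return math.exp(-1.0 / t) if t > 0 else 0.0

def taylor_partial_sum(t, N):
    # Every derivative of f at c = 0 vanishes, so each Taylor
    # coefficient f^(k)(0)/k! is 0 and every partial sum is 0.
    coeffs = [0.0] * (N + 1)
    return sum(c * t**k for k, c in enumerate(coeffs))

t = 0.5
print(f(t))                       # exp(-2), about 0.1353
print(taylor_partial_sum(t, 50))  # 0.0: the series converges, but to 0, not f(t)
```

The gap between the two printed values is exactly the remainder term: here it equals $f(t)$ itself and does not shrink as $N$ grows, even though the series converges.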
First, one usually uses the function $e^{-1/x^2}$ (extended by $0$ at $x=0$) and its Taylor series at $c=0$. For that case you will find all the arguments in "Under what conditions does $f$ equal its Taylor series about $a$ on the closed interval $[a,b]$?".
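To see concretely why all derivatives of $e^{-1/x^2}$ vanish at $0$, one can look at the difference quotients numerically. A small sketch (the name `g` is mine, and this only checks the first derivative; the higher ones vanish by the same mechanism, since $e^{-1/x^2}$ decays faster than any power of $x$ grows):

```python
import math

def g(x):
    # exp(-1/x^2) for x != 0, extended by 0 at x = 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The difference quotients at 0 shrink extremely fast, consistent
# with g'(0) = 0; e.g. at h = 0.1 the quotient is about exp(-100)/0.1.
for h in [0.5, 0.2, 0.1]:
    print(h, (g(h) - g(0)) / h)
```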