Suppose we have a function $f$ that is $C^\infty$ on some interval $J \subseteq \mathbb R$. We choose a point $x_0 \in J$ and compute the Taylor series $\sum a_n (x-x_0)^n$. Next, we calculate the radius of convergence $r$ (assume $0 < r < \infty$), so we know the series converges (absolutely, and uniformly on compact subsets) to some function $g$ on $(x_0 - r, x_0 + r)$. Finally, we compute the Taylor remainder $R_n(x)$ and show that it converges to zero on $(x_0 - r, x_0 + r)$. We conclude that $f = g$ on $(x_0 - r, x_0 + r)$.
So far, so good. Now we turn to endpoint behavior: suppose our Taylor series converges at one of the endpoints -- say, at $x_0 + r$. (That is, we confirm that the series $\sum a_n r^n$ converges.) Suppose also that $x_0 + r \in J$, so that $f$ is defined and continuous there. Are we justified in concluding that $\sum a_n r^n = f(x_0 + r)$?
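To make the question concrete, here is a numerical sketch of one instance (my own illustrative example, not part of the question): take $f(x) = \ln(1+x)$ with $x_0 = 0$, whose Taylor series has radius $r = 1$. At the endpoint $x = 1$ the series becomes $\sum_{n \ge 1} (-1)^{n+1}/n$, which converges by the alternating series test, and the question is whether its sum must equal $f(1) = \ln 2$:

```python
import math

# Taylor series of ln(1+x) at x0 = 0: sum_{n>=1} (-1)^(n+1) x^n / n, radius r = 1.
# At the endpoint x = 1 this is the alternating harmonic series.
def partial_sum(n_terms):
    """Partial sum of the series evaluated at the endpoint x = 1."""
    return sum((-1) ** (n + 1) / n for n in range(1, n_terms + 1))

s = partial_sum(10**6)
# Numerically, the partial sums appear to approach f(1) = ln 2
# (the alternating series bound puts the error below 1/(n_terms + 1)).
print(s, math.log(2), abs(s - math.log(2)))
```

Of course, a numerical check is only suggestive; the question is whether this agreement at the endpoint is forced in general.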
That is:
If a Taylor series converges to $f$ inside an open interval, and it converges (to something) at one of the endpoints of the interval, does it necessarily converge to $f$ at the endpoint?
Or to put it another way:
Can a Taylor series converge to $f$ inside an open interval but converge to something else at one (or both) of the interval's endpoints?
My instinct is that if the series converges at an endpoint, it must agree with $f$ there; I feel that there is an elementary argument, relying solely on the facts that
- The original function $f$ is continuous at $x_0 + r$
- The limit of the series is continuous inside $(x_0 - r, x_0 + r)$
- The two functions agree inside $(x_0 - r, x_0 + r)$
...but for some reason I can't see how to connect the dots to reach the conclusion that the two functions agree at $x_0 + r$.
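To spell out exactly where the dots fail to connect (my restatement, using the notation above): write $g(x) = \sum a_n (x-x_0)^n$ for the sum of the series on the open interval. The three facts combine like this:

```latex
% f = g on (x_0 - r, x_0 + r), and f is continuous at x_0 + r, so:
\lim_{x \to (x_0 + r)^-} g(x)
  \;=\; \lim_{x \to (x_0 + r)^-} f(x)
  \;=\; f(x_0 + r).
% What is still needed is that the value of the series AT the endpoint
% equals this one-sided limit of g:
\sum_n a_n r^n \;\overset{?}{=}\; \lim_{x \to (x_0 + r)^-} g(x).
```

So the missing step is left-continuity of the series itself at $x_0 + r$: pointwise convergence at the endpoint does not by itself say how the endpoint value relates to the limit of $g$ from inside the interval.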