Determining, without recourse to complex analysis, which functions $f: \mathbf{R} \to \mathbf{R}$ converge to their Taylor series


One conceptual stumbling block for me, and I'm sure for many others, in introductory calculus was that it was never explained how we could be sure that a function $f: \mathbf{R} \to \mathbf{R}$ actually converges to its power series. It seemed implausible that a function's derivatives at a single point could determine it completely, even on a short interval. And indeed they cannot: there are well-known examples of distinct functions that agree, together with all their derivatives, at a single point $x \in \mathbf{R}$, and which therefore have identical convergent Taylor series about $x$. For example, $$ \begin{align*} f(x) &= 0 \\ g(x) &= \begin{cases} e^{-1/x^2} & x \neq 0 \\ 0 & x=0, \end{cases} \end{align*}$$ both have derivatives of all orders equal to zero at $x=0$.

We can explain with complex analysis why $g$ does not equal its Taylor series on any neighborhood of $0$: the complex function $g: \mathbf{C} \to \mathbf{C}$ given by $g(z) = e^{-1/z^2}$ has an essential singularity at the origin, tending to $0$ when the origin is approached along the real axis but to $\infty$ in modulus when approached along the imaginary axis, so it has no complex derivative at the origin at all. But this explanation requires students to understand the notions of holomorphic and analytic functions, and some basic results of complex analysis such as the Cauchy integral formula.

Without using complex analysis, can we prove of any function of one real variable, besides the polynomials, that it converges to its Taylor series? At the very least, is this possible for the functions commonly seen in basic calculus: ratios of polynomials, sines, cosines, logarithms, and exponentials?


There are 2 answers below.

BEST ANSWER

You're talking about the collection of all real analytic functions. Checking that the Taylor series (centered at a particular point) converges to the correct function on the interval of convergence requires considering the remainder term $R_n$ for the $n$th-degree Taylor polynomial and showing that it converges to $0$ as $n\to\infty$ for all $x$ in the interval. [It's also not too hard to show, by analyzing these remainder estimates, that functions "built" out of real analytic functions (by differentiating, integrating, adding, multiplying, dividing, etc.) are again real analytic on the appropriate domain.]
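To make the remainder criterion concrete, here is the standard worked example (using the Lagrange form of the remainder, applied to the exponential, which the answer above does not spell out):

$$ f(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k + R_n(x), \qquad R_n(x) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-a)^{n+1} $$

for some $c$ between $a$ and $x$. For $f(x) = e^x$ about $a = 0$, every derivative is $e^x$, so $$ |R_n(x)| \le e^{|x|}\,\frac{|x|^{n+1}}{(n+1)!} \to 0 \quad \text{as } n \to \infty $$ for every fixed $x$, since factorials outgrow geometric terms. Hence $e^x$ equals its Maclaurin series on all of $\mathbf{R}$, with no complex analysis needed.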

For the function $g$ you brought up, no complex analysis whatsoever is needed. Each derivative of $g$ at a point $x \neq 0$ has the form $p(1/x)\,e^{-1/x^2}$ for some polynomial $p$, and since $e^t$ outgrows every polynomial in $t$ as $t \to \infty$, these expressions tend to $0$ as $x \to 0$. An induction on this fact shows that all the derivatives of $g$ at $0$ are in fact $0$. So in this case $R_n(x) = g(x)$ for all $n$, and the criterion fails everywhere except at $x = 0$.
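A quick numerical sketch (my addition, not part of the original answer) illustrates why every Taylor coefficient of $g$ at $0$ must vanish: $g(x)$ decays faster than any power of $x$ as $x \to 0$, so each quotient $g(x)/x^n$ tends to $0$. The function name `g` below simply mirrors the question's notation:

```python
import math

def g(x):
    # The flat function from the question: e^{-1/x^2} away from 0, and 0 at 0.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# g(x) is crushed below every power of x near the origin,
# which is the mechanism forcing all Taylor coefficients at 0 to be 0.
for n in (1, 5, 10):
    x = 0.1
    print(f"g(0.1) / 0.1^{n} = {g(x) / x**n:.3e}")
```

Even dividing by $x^{10}$ at $x = 0.1$ leaves a number on the order of $10^{-34}$, consistent with all derivatives at $0$ being zero.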

ANSWER

Trigonometric, hyperbolic and exponential functions are analytic since they're defined as sums of power series. The same is true for rational functions, since the quotient of two analytic functions is analytic on any open set where the denominator does not vanish.