Radius of convergence of the Taylor series of a function of a real variable


We know that if $f(z)=1/(1+z^2)$, then the radius of convergence of the Taylor series of $f(z)$ equals the distance from the center of the series to the nearest singularity of $f(z)$. But suppose instead we are given the real function $f(x)=1/(1+x^2)$ and asked where its Taylor series about a point converges. Since $f(x)$ is infinitely differentiable at every real number, can I say that the Taylor series of $f(x)$ converges on all of $\mathbb{R}$?
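For concreteness (a standard expansion, added here only as an illustration of the complex-variable statement): centered at $0$,
$$\frac{1}{1+z^2} = \sum_{n=0}^{\infty} (-1)^n z^{2n}, \qquad |z| < 1,$$
and $1$ is exactly the distance from the center $0$ to the singularities $\pm i$.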

Accepted answer

Taking "the radius of convergence of the Taylor series of $f(z)$ is the distance to the nearest singularity of $f(z)$", where the distance is measured from the center of the Taylor series, we see that the singularities of $f(z) = \frac{1}{1 + z^2}$ are $i$ and $-i$. Thus for any $x \in \mathbb{R} \subset \mathbb{C}$ we have $\min(|x - i|, |x - (-i)|) = \sqrt{1 + x^2} \geq 1 > 0$, and so $f|_{\mathbb{R}} \colon \mathbb{R} \rightarrow \mathbb{R}$ is analytic on $\mathbb{R}$; that is, the Taylor series centered at any point $x_0 \in \mathbb{R}$ has a positive radius of convergence. Note, however, that this radius equals the distance $\sqrt{1 + x_0^2}$ to $\pm i$, which is finite: for example, the series centered at $0$ converges only on $(-1, 1)$, not on all of $\mathbb{R}$.
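A minimal numerical sketch of this last point (the helper names `partial_sum` and `f` below are illustrative, not from the answer): the partial sums of the Maclaurin series $\sum_{n \ge 0} (-1)^n x^{2n}$ approach $f(x)$ for $|x| < 1$ and blow up for $|x| > 1$, consistent with the radius of convergence about $x_0 = 0$ being $\sqrt{1 + 0^2} = 1$.

```python
# Illustrative check (not part of the original answer): partial sums of the
# Maclaurin series of f(x) = 1/(1 + x^2), i.e. sum_{n>=0} (-1)^n x^(2n).

def partial_sum(x, terms):
    """Sum of the first `terms` terms of the Maclaurin series at x."""
    return sum((-1) ** n * x ** (2 * n) for n in range(terms))

def f(x):
    return 1.0 / (1.0 + x ** 2)

for x in (0.5, 0.9, 1.1, 2.0):
    print(f"x = {x}: partial sum (50 terms) = {partial_sum(x, 50):.6g}, f(x) = {f(x):.6g}")

# Expected pattern: the partial sums match f(x) for x = 0.5 and 0.9 (inside the
# radius of convergence 1) and grow without bound for x = 1.1 and 2.0 (outside
# it), even though f itself is smooth on all of R.
```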