Taylor Functions for Complex and Real Valued Functions


Some days ago, I asked in which situations we may apply Taylor series to real-valued functions. In that question (Proof Verification and Taylor Series), I wrote a statement about the applicability of the series; however, it was shown to be incorrect by a counterexample. After that, I searched on the internet and also in the recommended books, especially Elon's, about the series.

But I'm still not sure; it was not clear to me. So, just to be sure, let me state the situations in which, it seems to me, the series may be applied.

QUESTION 1: Are the two following propositions correct?

P1 Let $f: D \to \mathbb{R}$ be an analytic (1) real-valued function on its domain $D$, and let $x, x_0 \in D$. Then we may apply Taylor's Theorem and the series converges: $$ f(x) = \displaystyle\sum_{n=0}^\infty \frac{f^{(n)}(x_0)}{n!}(x-x_0)^n$$
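As a quick numerical illustration of P1 (a sketch, using $f = \exp$ and $x_0 = 0$, where every derivative at $0$ equals $1$), the partial sums of the series approach $f(x)$:

```python
import math

def taylor_partial_sum(deriv_at_x0, x0, x, n_terms):
    """Partial sum of the Taylor series: sum of f^(n)(x0) * (x-x0)^n / n!."""
    return sum(deriv_at_x0(n) * (x - x0) ** n / math.factorial(n)
               for n in range(n_terms))

# For f = exp at x0 = 0, every derivative is exp(0) = 1,
# and the series converges to exp(x) for every real x.
approx = taylor_partial_sum(lambda n: 1.0, 0.0, 1.0, 20)
print(abs(approx - math.e))  # very small: the series converges to exp(1)
```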

Now, about complex-valued functions (2):

P2 Let $f: D \to \mathbb{C}$ be an analytic complex-valued function on its domain $D$, and suppose the disk $|z-z_0|<R$ is contained in $D$. Then we may apply Taylor's Theorem for complex functions and the series converges:

$$ f(z) = \displaystyle\sum_{n=0}^\infty \frac{f^{(n)}(z_0)}{n!}(z-z_0)^n$$

Observations:

(1) Some sources say "infinitely differentiable" instead of "analytic". I would like to know why, since, as answered in the other question, the function needs to be "analytic". That makes me even more confused...

(2) As far as I know, a real-valued analytic function is an infinitely differentiable one whose Taylor series converges (to the function) around each point. But what about complex functions? Do I just need to verify the Cauchy-Riemann equations?

Thanks in advance

Where I searched:

  1. Elon Lages Lima's Curso de Análise, vol. I;

  2. https://en.wikipedia.org/wiki/Analytic_function;

  3. https://en.wikipedia.org/wiki/Taylor%27s_theorem;

  4. Some other texts out there...

EDIT

QUESTION 2: Okay, from the answers below, I'm now sure how to determine whether $f$ is complex analytic or not. However, if it is, will the power series converge absolutely?

ANSWER 1

Your first proposition is not a proposition. At least, not in Elon's book (and in none of the main sources I know, including Wikipedia): it's a definition. By definition, an analytic (real) function is one whose Taylor series converges to the function.

I will quote Elon's definition (my translation; you can check Section X.4 of the book).

A function $f: I \longrightarrow \Bbb{R}$, defined on an open interval $I$, is called analytic when it is $C^{\infty}$ and, for every $x_0 \in I$, there is $r>0$ such that $x \in (x_0-r, x_0+r)$ implies $x \in I$ and $$f(x)=f(x_0)+f'(x_0)(x-x_0)+\dots+\dfrac{f^{(n)}(x_0)}{n!}(x-x_0)^n+\dots$$

You see, this is not a theorem. I am not proving anything. I am giving a name to the functions that have a convergent Taylor series, and whose Taylor series converges to the function.

Let's take a look at Taylor's Theorem now. This theorem concerns a broader class of functions. I'll use Wikipedia's version of the theorem, but you can check Theorems 9 and 10 of Elon's book, for example.

Quoting Wiki:

Let $k \ge 1$ be an integer and let the function $f : \mathbb{R} \to \mathbb{R}$ be $k$ times differentiable at the point $a \in \mathbb{R}$. Then there exists a function $h_k : \mathbb{R} \rightarrow \mathbb{R}$ such that

$$f(x)=f(a)+f'(a)(x-a)+\frac {f''(a)}{2!}(x-a)^{2}+\cdots +\frac {f^{(k)}(a)}{k!}(x-a)^{k}+h_{k}(x)(x-a)^{k},$$ with $\lim_{x \rightarrow a}h_k(x)=0$.

We demand much less of the function in Taylor's Theorem: it needs to be only $k$ times differentiable. However, the theorem also gives us much less: we only get a polynomial approximation of degree $k$.
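To see concretely what the theorem gives (a sketch, using $f = \cos$, $a = 0$, $k = 2$, so that $h_2(x) = (\cos x - 1 + x^2/2)/x^2$):

```python
import math

def h2(x):
    # Peano remainder for f = cos at a = 0 with k = 2:
    # cos(x) = 1 - x**2 / 2 + h_2(x) * x**2
    return (math.cos(x) - 1 + x**2 / 2) / x**2

# h_2(x) -> 0 as x -> 0, exactly as Taylor's Theorem guarantees.
values = [h2(x) for x in (0.1, 0.01, 0.001)]
print(values)
```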

The point you seem to be misunderstanding is this: we can't cheat and say "well, since for a $k$ times differentiable function we have an approximation by a polynomial of degree $k$, with $k=\infty$ we must have an approximation by an 'infinite-order polynomial'". This is not true. I gave you a counterexample in your previous question: the function $$f(x)=\begin{cases}e^{-1/x^2}, & \text{if } \ x \neq 0; \\ 0,& \text{if } \ x=0.\end{cases}$$

This function is not analytic at $0$. Note that we can apply Taylor's Theorem to $f$ for any value of $k$. In fact, since $f^{(k)}(0)=0$ for every $k$, it suffices to take $h_k(x)=f(x)x^{-k}$ for every $k$.

But Taylor's Theorem applies only for finite values of $k$. That's why we need a special name for functions that are not only $C^{\infty}$, but whose Taylor series also converges.
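A quick numerical check of this counterexample (a sketch): the Taylor series of $f$ at $0$ is identically zero, yet $f$ itself is not.

```python
import math

def f(x):
    # The classic non-analytic C-infinity function.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# Every derivative of f at 0 vanishes, so the Taylor series of f at 0
# is the zero function -- but f is positive away from 0.
taylor_series_value = 0.0    # the all-zero series sums to 0 at every x
print(f(0.5), taylor_series_value)   # f(0.5) = e^{-4} > 0, series gives 0
```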

Now, the complex case. You can take a look at Conway's book on complex analysis; it's a standard source. However, I will use the book I have at home at the moment: Complex Analysis, by Ian Stewart and David Tall.

We define analytic functions in the same way we do for real functions, mutatis mutandis (i.e. we replace the open interval by an open disk, etc.). But we are happier in the complex case: if we prove that a function $f$ is differentiable once at every point of an open disk, then it is analytic on that disk. That is really nice :) There is no comparably simple characterization of real analytic functions.

In other words:

Let $D$ be an open disk containing $z_0$. If $f:D \rightarrow \Bbb{C}$ is differentiable in $D$, then, for every $z \in D$, $$f(z)=\sum_{n=0}^{\infty} \dfrac{f^{(n)}(z_0)}{n!}(z-z_0)^n.$$

That's why some authors say that a complex function is analytic if it is differentiable in an open set - that's all you need to guarantee the existence and convergence of its Taylor series.

Why is this, though? The point is, complex-differentiability is way stronger than real-differentiability.
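As a small illustration of the statement above (a sketch, using the familiar example $f(z) = 1/(1-z)$, which is differentiable on the open disk $|z| < 1$ and whose Taylor coefficients at $z_0 = 0$ are all $1$):

```python
def f(z):
    return 1 / (1 - z)

def taylor_partial(z, n_terms):
    # Taylor series of 1/(1-z) at z0 = 0 is the geometric series: sum of z**n.
    return sum(z**n for n in range(n_terms))

z = 0.3 + 0.4j               # |z| = 0.5, well inside the disk |z| < 1
err = abs(f(z) - taylor_partial(z, 60))
print(err)                   # essentially zero: the series converges to f(z)
```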

ANSWER 2

P1 is incorrect (here $D$ is a real interval, finite or infinite, even if $|x-x_0| < \operatorname{dist}(x_0, \partial D)$), but P2 is correct if we assume $|z-z_0| < \operatorname{dist}(z_0, \partial D)$ (here $D$ is a complex open connected set), and that shows one of the big differences between real-analytic and complex-analytic functions.

First, to clear up a little confusion: a real analytic function $f$ (on some interval $I$) is one that satisfies two conditions:

1: $f$ is infinitely differentiable ($f \in C^{\infty}(I)$);

2: for any $x_0 \in I$ there is a small (relatively open) interval $x_0 \in I_{x_0} \subset I$ on which the Taylor series of $f$ centered at $x_0$ converges to $f(x)$ for all $x \in I_{x_0}$ (we may allow $I$ to be closed at one or both ends, $x_0$ to be an endpoint, etc.).

Here things may break in two ways. First, the Taylor series may fail to converge at any point near some $x_0$: its radius of convergence is $0$ or, if you wish, there is a subsequence for which $|f^{(n_k)}(x_0)/n_k!|^{1/n_k} \to \infty$. Second, the Taylor series may converge near $x_0$ (even on the whole of $I$, or on the whole real axis) but not to $f$; in fact we may have convergence to $f$ on one side of $x_0$ but not on the other (for this last type of breakdown $x_0$ is assumed to be interior to $I$). $C^{\infty}$ functions with compact support in some interval $[a,b]$ exhibit this breakdown at the endpoints of the interval: there they have zero Taylor series, which certainly converges to $f$ on the side outside $[a,b]$, but not on the side inside, assuming $f$ is not identically zero near $a$ or $b$.

Even if the function is real analytic on $I$, the Taylor series at a point $x_0$ may fail to converge for all $x$ with $|x-x_0| < \operatorname{dist}(x_0, \partial I)$. A simple example is $1/(1+x^2)$, which is real analytic on the whole real line, yet its Taylor series at zero has radius of convergence $1$.
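Numerically (a sketch), one can watch the series $\sum (-1)^n x^{2n}$ of $1/(1+x^2)$ converge for $|x|<1$ and blow up for $|x|>1$, even though the function itself is analytic on all of $\mathbb{R}$:

```python
def f(x):
    return 1 / (1 + x**2)

def partial(x, n_terms):
    # Taylor series of 1/(1+x^2) at 0: sum of (-1)^n * x^(2n), radius 1.
    return sum((-1)**n * x**(2 * n) for n in range(n_terms))

inside = abs(f(0.5) - partial(0.5, 40))                   # |x| < 1: converges
outside = (abs(partial(1.5, 10)), abs(partial(1.5, 20)))  # |x| > 1: blows up
print(inside, outside)
```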

Many weird things can happen. For example, there are $C^{\infty}$ functions that are not analytic at any point of an interval $[a,b]$ (and perforce their Taylor series must diverge on an everywhere dense set $G \subset [a,b]$), while conversely there are $C^{\infty}$ functions whose Taylor series converges at every point of $[a,b]$ (in other words, the radius of convergence $\rho(x) > 0$ for all $x \in [a,b]$) but which fail to be analytic on an arbitrary nowhere dense closed set (which can be quite big if we think of Cantor-like sets). A non-trivial theorem of Pringsheim-Boas shows that if $\rho(x) > \delta > 0$ for all $x \in I$, then $f$ is analytic; and various results (Bernstein, Boas, Schaeffer) give sufficient conditions for $f$ to be analytic in terms of the zeros of its derivatives, the most general being that if every derivative of $f$ has at most a fixed number $p$ of zeros on $I$, then $f$ is analytic.

Second, a complex function on some domain (open connected set) $D \subset \mathbb{C}$ is analytic if and only if it is complex differentiable on $D$ (so being complex differentiable once on an open set implies infinite differentiability, and analyticity too); the Taylor series of $f$ at $z_0$ then converges for all $z$ with $|z-z_0| < \operatorname{dist}(z_0, \partial D)$.

None of the above weird things from the real analytic world happens in the complex analytic world.