Is there a proof that all analytic functions only have one unique Taylor series representation?


I know that a function can admit multiple series representations (according to Eugene Catalan), but I wonder whether there is a proof of the fact that each analytic function has only one Taylor series representation. I know that Taylor series are defined by derivatives of increasing order, and a function has one and only one derivative of each order. Can this fact be employed to prove that each function has only one Taylor series representation?


There are 4 answers below.


I think this simple proof is sufficient. I'm going to do it in two cases, but the first case is really a special case of the second.

Suppose a function $f(x)$ has two Taylor series representations:

$$f(x)=\sum a_n x^n$$

$$f(x) = \sum b_n x^n$$

We know that $f(x) - f(x) = 0$, so substitute each of the representations:

$$f(x) - f(x) = \sum b_n x^n - \sum a_n x^n = 0$$

$$\sum (b_n-a_n) x^n = 0$$

The only way the sum can be $0$ for all $x$ is if each coefficient vanishes separately, since monomials of different degree cannot cancel one another identically in $x$:

$$b_n-a_n = 0 $$ $$b_n =a_n $$
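As a finite-truncation sanity check of this first case (a sketch of mine, treating the series as polynomials; `poly_eval` and the sample coefficients are illustrative assumptions, not part of the argument):

```python
def poly_eval(coeffs, x):
    """Horner evaluation of the truncated series sum coeffs[n] * x**n."""
    acc = 0.0
    for c in reversed(coeffs):
        acc = acc * x + c
    return acc

a = [1.0, -2.0, 0.5]
b = [1.0, -2.0, 0.5]           # identical coefficients
diff = [bn - an for an, bn in zip(a, b)]
print(all(poly_eval(diff, x) == 0 for x in (-1.0, 0.0, 0.7, 2.0)))   # True

b2 = [1.0, -2.0, 0.6]          # change a single coefficient...
diff2 = [bn - an for an, bn in zip(a, b2)]
# ...and the difference no longer vanishes everywhere:
print(any(poly_eval(diff2, x) != 0 for x in (-1.0, 0.0, 0.7, 2.0)))  # True
```

The difference series vanishes identically exactly when every coefficient difference is zero, which is the claim $b_n = a_n$.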

Now suppose we center the series at different points for each representation, i.e.

$$f(x)=\sum a_n (x-a)^n$$

$$f(x) = \sum b_n (x-b)^n$$

The binomial theorem is helpful here:

$$f(x)=\sum_n a_n (x-a)^n = \sum_n a_n\sum_{k=0}^n\binom{n}{k}(-a)^{n-k}x^k =\sum_k a'_kx^k $$

so $a'_k = \sum_{n\ge k} a_n \binom{n}{k}(-a)^{n-k}$ is just a new constant (the interchange of the sums can be justified by absolute convergence). The same happens with the other representation, with $b$ in place of $a$, and the first case then gives again that

$$b'_k =a'_k$$

So the Taylor series representation is unique.
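The recentering step can be sketched numerically (a hedged illustration: `recenter_to_zero` and the sample polynomial $(x-1)^2$, written about two different centers, are assumptions of mine, not from the answer):

```python
from math import comb

def recenter_to_zero(coeffs, c):
    """Expand sum_n coeffs[n]*(x-c)^n into standard coefficients of x^k
    via the binomial theorem: (x-c)^n = sum_k C(n,k) * x^k * (-c)^(n-k)."""
    out = [0] * len(coeffs)
    for n, a_n in enumerate(coeffs):
        for k in range(n + 1):
            out[k] += a_n * comb(n, k) * (-c) ** (n - k)
    return out

# f(x) = (x-1)^2, written about a = 1 and about b = 0
about_1 = [0, 0, 1]      # 0 + 0*(x-1) + 1*(x-1)^2
about_0 = [1, -2, 1]     # 1 - 2x + x^2
print(recenter_to_zero(about_1, 1))  # [1, -2, 1]
print(recenter_to_zero(about_0, 0))  # [1, -2, 1]
```

Both representations collapse to the same coefficients $a'_k$ in powers of $x$, as the argument predicts.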


You can prove that a power series is differentiable on the interior of its interval of convergence, and that its derivative is obtained by differentiating term by term. From this you can conclude that the coefficient of $x^n$ must be $\frac{f^{(n)}(0)}{n!}$. So the coefficients are determined uniquely, and the Taylor series is unique.
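A small sketch of this coefficient-extraction argument, on a truncated series with exact rational arithmetic (the helper names and the sample series are my own illustration):

```python
from fractions import Fraction

def deriv(coeffs):
    """Term-by-term derivative: sum a_n x^n -> sum n*a_n x^(n-1)."""
    return [n * a for n, a in enumerate(coeffs)][1:]

def taylor_coeff(coeffs, n):
    """Recover a_n as f^(n)(0)/n!: differentiate n times term by term,
    then read off the constant term and divide by n!."""
    c = coeffs
    fact = 1
    for k in range(n):
        c = deriv(c)
        fact *= k + 1
    return Fraction(c[0] if c else 0, fact)

# e.g. f(x) = 2 + 3x + 5x^3 (a truncated series); each coefficient
# is recovered exactly from the derivatives at 0
series = [Fraction(2), Fraction(3), Fraction(0), Fraction(5)]
print([taylor_coeff(series, n) for n in range(4)])  # [2, 3, 0, 5]
```

Since each $a_n$ is computed from $f$ alone, no second set of coefficients is possible.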


Well, it's possible to have, e.g., $f(x) = \sum a_n x^n = \sum b_n (x-1)^n$ simultaneously, but that probably isn't what you meant. Instead, let's just consider the behavior at one point, say expanding around $x=0$.

Let's fix notation:

A "power series" (at $x=0$) is any series formally defined by $\sum_{n=0}^\infty a_n x^n$. A "Taylor series" (at $x=0$) for a smooth (i.e. $C^\infty$) function $f$ is the power series formally defined by $\sum_{n=0}^\infty \frac{f^{(n)}(0)}{n!} x^n$.

So any function that is infinitely differentiable (at $x=0$) has a unique Taylor series at $0$ [note that the Taylor series may not converge, and even if it converges, it may not converge to $f$]. But I think you are really asking whether any "analytic function" (a term I haven't defined yet) is equal near each point to a unique power series, namely the Taylor series. You can first prove the following result, which allows you to define the concept of "analytic function":

Theorem 1. Any power series $\sum_{n=0}^\infty a_n x^n $ that converges at one $x_0$ where $|x_0|=\rho>0$, converges absolutely and locally uniformly on the set $|x|<\rho $, where it defines a $C^\infty$ function $F(x) := \sum_{n=0}^\infty a_n x^n$, and $ a_n = \frac{F^{(n)}(0) }{n!}$.

In particular, the power series is the Taylor series of $F$. An "analytic function" (near $x=0$) is defined to be any such function $F$ that can be obtained in this way (i.e. an analytic function is a $C^\infty$ function locally equal to a convergent power series, its Taylor series.)
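As a quick numeric illustration of Theorem 1 (an assumed example of mine: the geometric series, which has $a_n = 1$ and converges for $|x|<1$ to $F(x)=1/(1-x)$, with $F^{(n)}(0)/n! = n!/n! = 1$):

```python
# The power series sum x^n converges on |x| < 1 to F(x) = 1/(1-x);
# a long partial sum agrees with the closed form to machine precision.
x = 0.3
partial = sum(x ** n for n in range(200))
print(partial, 1 / (1 - x))  # the two values agree to machine precision
```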

Suppose now that we have $\lim_{N\to\infty}\sum_{n=0}^N a_n x^n = 0$ for $|x|<r$. Then I claim that $a_n = 0$ for all $n$, proving the uniqueness of convergent power series for $f(x) = 0$. This immediately follows from Theorem 1 above, which allows us to talk of the function $F(x) := \sum_{n=0}^\infty a_n x^n$. But by hypothesis, $F$ is actually the zero function, so we have $a_n = \frac{F^{(n)}(0) }{n!} = 0$.

This implies the uniqueness of convergent power series (at $0$) for any analytic function; for if there were two different ones, their difference would be a nonzero convergent power series equal to 0, which doesn't exist.

I'll sketch the proof of the main result (Theorem 1). We have convergence at $x=x_0$ where $|x_0|=\rho$. Let $0<r<\rho$. Then note that (since $\sum_{n=0}^\infty d_n$ exists implies $d_n \to 0$) $$a_n x_0^n \xrightarrow[n\to\infty]{} 0 \implies |a_n| |x_0|^n = |a_n|\rho^n \xrightarrow[n\to\infty]{} 0.$$ In particular there exists $M>0$ such that $|a_n| \rho^n < M$ for all $n$. Therefore for any $x$ such that $|x|\le r$, since $\frac r{\rho}<1$, the geometric series formula gives $$ |a_n x^n| \le |a_n | r^n = |a_n | \rho^n \left(\frac r{\rho} \right)^n \le M \left(\frac r{\rho} \right)^n, \quad\sum_{n=0}^\infty M \left(\frac r{\rho} \right)^n < \infty. $$ So by the Weierstrass M-test, the series converges absolutely and uniformly (and therefore pointwise) on the closed disk $|x|\le r$. It therefore defines a function, which we call $F(x)$.

If the series can be differentiated term by term, then a standard induction argument proves that $a_n = F^{(n)}(0)/n!$. Formally differentiating once, we obtain the series $\sum_{n=1}^\infty n a_n x^{n-1} = \sum_{n=0}^\infty (n+1) a_{n+1} x^n$. Now note that for $|x|\le r<\rho$, $$ (n+1) |a_{n+1}| |x|^{n} \le (n+1) |a_{n+1}| r^{n} \le (n+1) \frac M\rho \left(\frac{r}{\rho}\right)^n \le \frac{CM}\rho \left(\sqrt{\frac{r}{\rho}}\right)^{n}, \\ \sum_{n=0}^\infty \frac{CM}\rho \left(\sqrt{\frac{r}{\rho}}\right)^{n} < \infty$$ since there exists $C>0$ such that $n+1 < C \left(\frac{\rho}r\right)^{n/2}$ for all $n$. By the Weierstrass M-test, the formal series obtained by term-by-term differentiation converges absolutely and uniformly to some function $G$ on $|x|\le r$, which implies that $F$ is differentiable with $F'=G$. This argument is repeatable (using instead $n^k < C_k \left(\frac{\rho}r\right)^{n/2}$), proving by induction that $F$ is $C^\infty$ and validating the result $a_n = F^{(n)}(0)/n!$.
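The growth bound $n+1 < C(\rho/r)^{n/2}$ used above can be checked numerically (a sketch with an assumed sample ratio $q = r/\rho = 0.9$, not part of the proof):

```python
# (n+1) * q^(n/2) is bounded by some C, since q < 1 kills polynomial
# growth; hence (n+1)*q^n = (n+1)*q^(n/2) * q^(n/2) <= C * (sqrt(q))^n,
# a summable geometric majorant.
q = 0.9
C = max((n + 1) * q ** (n / 2) for n in range(2000))
assert all((n + 1) * q ** n <= C * q ** (n / 2) + 1e-12 for n in range(2000))

# The dominated series itself sums to 1/(1-q)^2:
partial = sum((n + 1) * q ** n for n in range(2000))
print(C, partial)  # partial is close to 1/(1-0.9)^2 = 100
```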


The Taylor series is indeed uniquely defined for any smooth function, regardless of whether it converges and whether, where it converges, it coincides with the function. So asking about uniqueness is a bit pointless; it is like asking about uniqueness of the derivative. However, the question can be turned into a sensible one if we ask whether $f(x)$ can be represented as a power series uniquely: if $\sum a_n(x-x_0)^n$ and $\sum b_n(x-x_0)^n$ both converge and are equal on some open interval, does it follow that $a_n=b_n$ for every $n$?

This can be reduced (by subtracting the two series) to the question: if $\sum c_n(x-x_0)^n=0$ on some open interval, does $c_n=0$ follow for every $n$?

Now assume that $\sum c_n(x-x_0)^n=0$ on some open interval $(a, b)$ with $x_0\in (a,b)$. Since every power series evaluates to $c_0$ at $x=x_0$, we conclude that $c_0=0$. Thus we can write our equation as

$$(x-x_0)\cdot\big(c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots+c_n(x-x_0)^{n-1}+\cdots\big)=0$$

It is tempting to multiply both sides by $(x-x_0)^{-1}$ and conclude that $c_1=0$ (and then, by induction, that every $c_n=0$), but we cannot do that at $x=x_0$, and the $x=x_0$ case is exactly the one we are interested in. Nevertheless, we can do it for $x\neq x_0$, and so we conclude that

$$c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots+c_n(x-x_0)^{n-1}+\cdots=0$$

for any $x\in (a,b)\setminus\{x_0\}$. Of course every power series converges at its center $x=x_0$; the question is whether its value there is $0$. It is, because every power series is continuous (as a function of $x$) wherever it converges. This implies that $c_1+c_2(x-x_0)+c_3(x-x_0)^2+\cdots=0$ for $x=x_0$ as well, and therefore $c_1=0$ by evaluating at $x=x_0$.

Now we repeat this process, and by a simple induction we conclude that $c_n=0$ for every $n\in\mathbb{N}$.
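The continuity step in this argument can be illustrated numerically (an assumed example of mine: $F(x)=\sin(x-x_0)$, a power series about $x_0$ with $c_0=0$ and $c_1=1$):

```python
import math

# After dividing F(x) = sin(x - x0) by (x - x0), continuity of the
# peeled series forces its value at the center to equal c_1, which
# matches the limit of F(x)/(x - x0) as x -> x0.
x0 = 0.5
for h in (1e-1, 1e-3, 1e-5):
    x = x0 + h
    print(math.sin(x - x0) / (x - x0))  # tends to c_1 = 1
```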