Power series differentiable at endpoints?


The usual theorem is something like this (please correct me if this is wrong):

Let $R>0$ ($R$ can be $+\infty$).

Suppose $c_0+c_1x+c_2x^2+c_3x^3+\dots$ converges on $(-R,R)$.

Define $f:(-R,R)\rightarrow \mathbb R$ by $f(x)=c_0+c_1x+c_2x^2+c_3x^3+\dots$.

Then $f$ is differentiable on $(-R,R)$, with $f'(x)=c_1+2c_2x+3c_3x^2+\dots$.
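As a quick numerical sanity check of this theorem (illustrative only; the geometric series $\sum x^n$, with $R=1$, is used as a concrete instance, and the helper `partial_sum` is ad hoc):

```python
# Sanity check of term-by-term differentiation on the geometric series:
# f(x) = sum_{n>=0} x^n = 1/(1-x) on (-1, 1), and its derivative series
# sum_{n>=1} n x^(n-1) = sum_{m>=0} (m+1) x^m should equal 1/(1-x)^2.

def partial_sum(coeffs, x, terms=2000):
    """Partial sum of sum_{n>=0} coeffs(n) * x**n."""
    return sum(coeffs(n) * x ** n for n in range(terms))

x = 0.5
f = partial_sum(lambda n: 1.0, x)        # c_n = 1 for all n
df = partial_sum(lambda n: n + 1.0, x)   # derivative series, reindexed

print(abs(f - 1 / (1 - x)))         # essentially 0
print(abs(df - 1 / (1 - x) ** 2))   # essentially 0
```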

Here's my question:

Now assume $R\neq+\infty$. In the above theorem, replace each instance of $(-R,R)$ with

(A) $(-R,R]$;

(B) $[-R,R)$; or

(C) $[-R,R]$.

to obtain new Theorems A, B, and C. Is each of these new Theorems true?

If not, how can we strengthen the assumptions so that each new Theorem is true?

(Added note: $R$ need not be the radius of convergence.)

There are 3 best solutions below


You cannot say anything about the convergence at $\pm R$. For example, take $R=1$ and consider $\sum \frac{x^{n}+(-x)^{n}}{n}$: the odd-indexed terms cancel, leaving $\sum_{k\ge 1}\frac{2x^{2k}}{2k}$, which converges on $(-1,1)$ but diverges at both $x=1$ and $x=-1$.
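A small numerical illustration of this example (Python, purely illustrative): for $|x|<1$ the surviving even terms sum to $-\log(1-x^2)$, while at $x=1$ the partial sums grow like a harmonic series.

```python
import math

# Partial sums of sum_{n>=1} (x^n + (-x)^n)/n.  The odd terms cancel,
# leaving the sum over even n of 2 x^n / n = -log(1 - x^2) for |x| < 1.

def S(x, N):
    return sum((x ** n + (-x) ** n) / n for n in range(1, N + 1))

print(S(0.9, 5000), -math.log(1 - 0.81))  # agree inside (-1, 1)
print(S(1.0, 100), S(1.0, 10000))         # keeps growing: divergence at x = 1
```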

To guarantee continuity, differentiability, etc. at $\pm R$, it suffices to assume that the radius of convergence exceeds $R$.


If $\sum_{n=0}^{\infty}a_nx^n$ converges on $[-R,R],$ does $\sum_{n=1}^{\infty}na_nx^{n-1}$ converge at either $-R$ or $R?$

The answer is no. Consider the power series

$$\sum_{n=0}^{\infty}\frac{x^{2n}}{(n+1)^2}.$$

This series converges for each $x\in [-1,1].$ The derivative series is

$$\sum_{n=1}^{\infty}2n\frac{x^{2n-1}}{(n+1)^2}.$$

At $x=1$ the derivative series is $\sum_{n=1}^{\infty}2n\dfrac{1}{(n+1)^2}=\infty.$ At $x=-1,$ the derivative series is $\sum_{n=1}^{\infty}2n\dfrac{-1}{(n+1)^2}=-\infty.$
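These computations can be checked numerically (an illustrative Python sketch, not part of the proof):

```python
import math

# f(x) = sum_{n>=0} x^(2n)/(n+1)^2 converges at x = +-1 (value pi^2/6 there),
# but its term-by-term derivative sum_{n>=1} 2n x^(2n-1)/(n+1)^2 diverges.

def f_partial(x, N):
    return sum(x ** (2 * n) / (n + 1) ** 2 for n in range(N))

def df_partial(x, N):
    return sum(2 * n * x ** (2 * n - 1) / (n + 1) ** 2 for n in range(1, N))

print(f_partial(1.0, 100000), math.pi ** 2 / 6)      # bounded: convergence
print(df_partial(1.0, 100), df_partial(1.0, 10000))  # grows ~ 2 log N: divergence
```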


If a power series has $0 < R < \infty$ as its radius of convergence, first we notice that all the (formal) derivatives and integrals (antiderivatives) of the power series have the same radius of convergence (this also holds, trivially, when $R=0$ or $R=\infty$).

Then convergence at either endpoint ($\pm R$) implies continuity and uniform convergence there by Abel's theorem (proved via summation by parts; the result is usually stated for $R=1$, but a simple change of variables proves it for arbitrary $0<R<\infty$). In particular we can integrate term by term, so endpoint convergence passes to all the integrals of the given power series. We cannot, however, say anything about the derivatives (see the examples in the other answers). On the other hand, if the power series diverges at an endpoint, then by the previous statement its derivatives must diverge at that endpoint too.

If we have growth conditions on the coefficients, we may be able to deduce more. Take $R=1$ for simplicity. Convergence of the power series at $1$ requires $a_n \to 0$ (necessary but of course not sufficient); convergence of the derivative at $1$ requires $na_n \to 0$; for the second derivative we need $n^2a_n \to 0$, and so on, all merely necessary conditions. On the other hand, if $n^{2+\epsilon}a_n$ is bounded for some $\epsilon >0$, we can immediately deduce that the derivative series converges absolutely at $\pm 1$, and hence that $\sum {na_nx^{n-1}}$ converges to $(\sum {a_nx^n})'$ uniformly and absolutely on $[-1,1]$.
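A numerical illustration of the sufficient condition (a Python sketch; the specific choice $a_n = n^{-5/2}$ is just an example, making $n^{2+\epsilon}a_n$ bounded with $\epsilon = 1/2$):

```python
# With a_n = n^(-2.5), the quantity n^(2+eps) * a_n is bounded for
# eps = 0.5, so sum n*|a_n| converges and the derivative series
# converges absolutely at x = +-1.

def a(n):
    return n ** -2.5

def deriv_partial(x, N):
    return sum(n * a(n) * x ** (n - 1) for n in range(1, N + 1))

s1 = deriv_partial(1.0, 10_000)
s2 = deriv_partial(1.0, 100_000)
print(s1, s2, s2 - s1)  # partial sums stabilize near zeta(1.5) ~ 2.6124
```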

The gap between the necessary and the sufficient coefficient conditions (for the derivative of a power series, the gap between $na_n \to 0$ and $n^{2+\epsilon}a_n$ bounded) is part of Tauberian theory; over the years various results have been obtained to bridge it, at least in cases of sufficient regularity of the coefficients, so there is no definitive answer to the second question.