Limiting behavior of a function defined by a Lambert-type series


Fix a positive real constant $\omega$, and let $\left\{ c_{n}\right\} _{n\geq1}$ be a sequence of real numbers such that the series $$\sum_{n=1}^{\infty}\frac{c_{n}}{n^{\omega}}$$ converges conditionally to a finite value $S$. Next, let $$L\overset{\textrm{def}}{=}\lim_{x\uparrow1}\left(1-x\right)^{\omega}\sum_{n=1}^{\infty}c_{n}\frac{1-\left(1-x^{n}\right)^{\omega}}{\left(1-x^{n}\right)^{\omega}}$$ Is it true that $L=S$? That is, is the interchange of limit and sum $$\lim_{x\uparrow1}\left(1-x\right)^{\omega}\sum_{n=1}^{\infty}c_{n}\frac{1-\left(1-x^{n}\right)^{\omega}}{\left(1-x^{n}\right)^{\omega}}\overset{?}{=}\sum_{n=1}^{\infty}c_{n}\lim_{x\uparrow1}\left(1-x\right)^{\omega}\frac{1-\left(1-x^{n}\right)^{\omega}}{\left(1-x^{n}\right)^{\omega}}=\sum_{n=1}^{\infty}\frac{c_{n}}{n^{\omega}}$$ valid? I already know that this is true whenever the series defining $S$ converges absolutely. I've tried a summation-by-parts argument to deal with the conditionally convergent case, which gives:

$$L-S=\lim_{x\uparrow1}\left(1-x\right)^{\omega}\sum_{n=1}^{\infty}\left(\left(\frac{n+1}{1-x^{n+1}}\right)^{\omega}-\left(\frac{n}{1-x^{n}}\right)^{\omega}+n^{\omega}-\left(n+1\right)^{\omega}\right)\sum_{k=n+1}^{\infty}\frac{c_{k}}{k^{\omega}}$$ and I suspect that the right tool is Dirichlet's test for series convergence, namely that $$\sum_{n=1}^{\infty}a_{n}b_{n}$$ converges whenever the $b_{n}$ form a non-increasing sequence of real numbers tending to $0$ as $n\rightarrow\infty$, and the $a_{n}$ are complex numbers whose partial sums are uniformly bounded: $$\sup_{N\geq1}\left|\sum_{n=1}^{N}a_{n}\right|<\infty$$ However, I can't quite see how to make the argument work.

Anyhow, how might I show that $L=S$ (using Dirichlet's test or some other method)? Or is there a counterexample that will dash my hopes to pieces? If there is a counterexample, might some minor extra stipulation on the $c_n$ guarantee $L=S$?
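For anyone who wants to experiment before proving anything, here is a quick numerical sanity check (my own sketch, not part of the question) using the conditionally convergent choice $c_n=(-1)^{n+1}$ with $\omega=1$, for which $S=\ln 2$. The truncation level $N$ and the sample points $x$ are arbitrary choices:

```python
import math

def F(x, omega, c, N=200_000):
    """Truncation of (1-x)^omega * sum_n c(n) * (1-(1-x^n)^omega)/(1-x^n)^omega."""
    total, xn = 0.0, 1.0
    for n in range(1, N + 1):
        xn *= x                      # xn = x**n, updated iteratively
        u = (1.0 - xn) ** omega
        total += c(n) * (1.0 - u) / u
    return (1.0 - x) ** omega * total

c = lambda n: (-1) ** (n + 1)        # conditionally convergent test sequence
S = math.log(2)                      # sum_{n>=1} (-1)^{n+1}/n = ln 2

for x in (0.9, 0.99, 0.999):
    print(x, F(x, 1.0, c), S)        # F(x, ...) should drift toward S as x -> 1
```

The truncation is harmless here because the tail terms decay like $x^n$; for $x=0.999$ and $N=200{,}000$ the neglected part is on the order of $e^{-200}$.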

On BEST ANSWER

Yes, it holds true. There is a known criterion for the regularity of Abelian summation methods:

Let $f_n:(a,b)\to\mathbb{R}$ meet $\lim\limits_{x\to b}f_n(x)=1$ for each $n$. Then, to have $$\sum_{n=1}^\infty a_n=\lim_{x\to b}\sum_{n=1}^\infty a_n f_n(x)$$ for any convergent $\sum_{n=1}^\infty a_n$, it is necessary and sufficient to have $$\sup_{x\in(c,b)}\sum_{n=1}^\infty\big|f_n(x)-f_{n+1}(x)\big|<\infty\tag{*}\label{maincond}$$ for some $c\in(a,b)$.
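For orientation (my own illustration, not part of the criterion's statement): classical Abel summation is the case $f_n(x)=x^n$ on $(0,1)$, and there the condition holds with bound $1$, since the sum telescopes: $\sum_n |x^n-x^{n+1}|=(1-x)\sum_n x^n=x<1$. A minimal check:

```python
def abel_variation(x, N=10_000):
    """Truncation of sum_{n>=1} |f_n(x) - f_{n+1}(x)| for f_n(x) = x**n."""
    return sum(abs(x**n - x**(n + 1)) for n in range(1, N + 1))

for x in (0.5, 0.9, 0.99):
    print(x, abel_variation(x))   # telescopes to x - x**(N+1), so always < 1
```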

A proof is in section $4.7$ of Divergent Series by G. H. Hardy. We only need the sufficiency (which is fairly easy to prove, essentially following your idea of summation by parts), but the necessity shows that this is really the only way to go. So we check \eqref{maincond} with $(c,b)=(0,1)$ and $$f_n(x)=\big(n(1-x)\big)^\omega\big((1-x^n)^{-\omega}-1\big).$$ It turns out that the supremum over $x\in(0,1)$ in \eqref{maincond} is the limit as $x\uparrow1$, and a computation of this value (call it $S_\omega$) is sketched below.
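One can probe the boundedness of $\sum_n|f_n(x)-f_{n+1}(x)|$ numerically before proving it. A rough sketch (truncation level and sample points are my own choices, and the tolerances are loose):

```python
def f(n, x, omega):
    """f_n(x) = (n(1-x))^omega * ((1-x^n)^(-omega) - 1)."""
    return (n * (1.0 - x)) ** omega * ((1.0 - x**n) ** (-omega) - 1.0)

def variation(x, omega, N=50_000):
    """Truncation of sum_n |f_n(x) - f_{n+1}(x)|."""
    return sum(abs(f(n, x, omega) - f(n + 1, x, omega)) for n in range(1, N))

# For omega <= 1 the values should stay below 1; for omega > 1 they stay
# bounded but exceed 1, approaching the limit S_omega as x -> 1.
for omega in (0.5, 1.0, 2.0):
    print(omega, [round(variation(x, omega), 4) for x in (0.9, 0.99, 0.999)])
```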

For a fixed $x\in(0,1)$, the map $n\mapsto f_n(x)$ is monotonic if $\omega\leqslant 1$ and unimodal if $\omega>1$.
Since $f_1(x)=1-(1-x)^\omega\to1$ as $x\uparrow1$ while $f_n(x)\to0$ as $n\to\infty$, the total variation of a unimodal sequence is twice its peak minus its boundary values, and this implies $\color{blue}{S_\omega=2L_\omega-1}$, where $L_\omega=\sup_{x\in(0,1)}\sup_n f_n(x)$; in the monotone case, $L_\omega=1$ if $\omega\leqslant 1$.

To show the above, in rough steps, we set $g_\omega(x)=(-\log x)^\omega\big((1-x)^{-\omega}-1\big)$ so that $$f_\color{blue}{n}(x)=\left(\frac{1-x}{-\log x}\right)^\omega g_\omega(x^{\color{blue}{n}}).$$ Now if $\omega>1$ then $g_\omega'(x)=0\iff 1-(1-x)^\omega=(-x\log x)/(1-x)$ has a unique solution $x=x_\omega\in(0,1)$ [consider the "inverse" function $\omega=\omega(x)$], giving eventually $\color{blue}{L_\omega=g_\omega(x_\omega)}$.

And if $\omega\leqslant 1$ then $g_\omega'(x)>0$ for $x\in(0,1)$ [shown similarly], giving the expected $\color{blue}{L_\omega=1}$.
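As a numerical illustration of the $\omega>1$ case (my own sketch; the bracket $[0.2,0.5]$ for the bisection and the grid resolution are ad hoc choices that happen to work for $\omega=2$): solve $1-(1-x)^\omega=(-x\log x)/(1-x)$ for $x_\omega$ and compare $g_\omega(x_\omega)$ with a brute-force grid maximum of $g_\omega$.

```python
import math

OMEGA = 2.0

def g(x, omega=OMEGA):
    """g_omega(x) = (-log x)^omega * ((1-x)^(-omega) - 1)."""
    return (-math.log(x)) ** omega * ((1.0 - x) ** (-omega) - 1.0)

def h(x, omega=OMEGA):
    """h(x) = 0 encodes the critical-point equation 1-(1-x)^omega = (-x log x)/(1-x)."""
    return 1.0 - (1.0 - x) ** omega + x * math.log(x) / (1.0 - x)

# Bisection on a bracket where h changes sign (valid for omega = 2).
lo, hi = 0.2, 0.5
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if h(lo) * h(mid) <= 0:
        hi = mid
    else:
        lo = mid
x_omega = 0.5 * (lo + hi)
L_omega = g(x_omega)

grid_max = max(g(k / 10_000) for k in range(1, 10_000))
print(x_omega, L_omega, grid_max)   # L_omega should match the grid maximum
```

The agreement between `L_omega` and `grid_max` is consistent with the claim that the critical-point equation locates the unique interior maximum of $g_\omega$.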