Having some trouble using Dirichlet's test to show series convergence


The series is $$\sum_{k=1}^{\infty} \frac{1}{k^\alpha}\log\left(1+\frac{1}{k}\right)$$

Since the summand is a product of two factors, and the log factor decreases monotonically to zero as $k\to\infty$, I want to use Dirichlet's test to show the convergence of this series; the dependence on the parameter $\alpha$ is the tricky part.

The second criterion to satisfy in order to apply the test is for

$$\sum_{k=1}^N \frac{1}{k^\alpha}$$

to be uniformly bounded in $N$. For $\alpha > 1$ this is clear, since it is then a partial sum of a convergent $p$-series.
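For instance, the integral comparison makes the bound explicit for $\alpha > 1$:

$$\sum_{k=1}^{N} \frac{1}{k^\alpha} \le 1 + \int_1^N \frac{dx}{x^\alpha} = 1 + \frac{1 - N^{1-\alpha}}{\alpha - 1} < 1 + \frac{1}{\alpha - 1}.$$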

So, I had thought that my answer was that this series converges for all $\alpha>1$.

Apparently not.

The full range of convergence is actually $\alpha > 0$.

Can I stay on this track and still use Dirichlet's test? If so, how can I tweak the above partial sum to show that it is also uniformly bounded in $N$ for all $\alpha > 0$?
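As a quick numerical sanity check (a minimal Python sketch; the function names, the sample values of $\alpha$, and the cutoffs $N$ are illustrative choices, not part of the problem), the partial sums of the full series appear to stabilize for every $\alpha > 0$, while the inner sums $\sum_{k=1}^N k^{-\alpha}$ grow without bound once $\alpha \le 1$:

```python
import math

# Partial sum of the full series: sum_{k=1}^N k^(-alpha) * log(1 + 1/k)
def series_partial_sum(alpha, N):
    return sum(k ** -alpha * math.log(1 + 1 / k) for k in range(1, N + 1))

# Partial sum of the inner factor alone: sum_{k=1}^N k^(-alpha),
# the quantity Dirichlet's test would need to be uniformly bounded
def inner_partial_sum(alpha, N):
    return sum(k ** -alpha for k in range(1, N + 1))

for alpha in (0.5, 1.0, 2.0):
    for N in (10**3, 10**4, 10**5):
        print(f"alpha={alpha}, N={N}: "
              f"series={series_partial_sum(alpha, N):.6f}, "
              f"inner={inner_partial_sum(alpha, N):.2f}")
```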

Thanks

Best answer:

Hint: To show that we have convergence for $\alpha\gt 0$, use the inequality $\log(1+x)\lt x$ if $x\gt 0$.
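Spelled out, the comparison reads: for every $k \ge 1$,

$$\frac{1}{k^\alpha}\log\left(1+\frac{1}{k}\right) < \frac{1}{k^\alpha}\cdot\frac{1}{k} = \frac{1}{k^{\alpha+1}},$$

and $\sum_k 1/k^{\alpha+1}$ is a convergent $p$-series precisely when $\alpha + 1 > 1$, that is, when $\alpha > 0$, so the series converges by comparison.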

To show we do not have convergence for $\alpha\le 0$, show first that we do not have convergence at $\alpha=0$; for $\alpha<0$ the terms are even larger, so divergence then follows by comparison. This can be done in various ways. One way is to use a lower estimate of $\log(1+1/k)$, such as $\log(1+x)\ge x/2$ for $0\le x\le 1$, and compare with the harmonic series.

Another is to note that $\log(1+1/k)=\log(k+1)-\log k$ and then use telescoping.
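Concretely, at $\alpha=0$ the partial sums telescope:

$$\sum_{k=1}^{N}\log\left(1+\frac{1}{k}\right)=\sum_{k=1}^{N}\bigl(\log(k+1)-\log k\bigr)=\log(N+1)\longrightarrow\infty\quad\text{as }N\to\infty,$$

so the series diverges at $\alpha=0$.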