Resource for the Proof of Root Test of Absolute Convergence


I was in lecture a couple of days ago, and I came across the Root Test for Absolute Convergence while studying over winter break. Basically it went something like this:

Absolute Convergence

$$\sum\limits_{n=0}^\infty a_n \ \mathrm{converges\ absolutely\ iff}\ \sum\limits_{n=0}^\infty \lvert a_n\rvert \ \mathrm{converges.}$$ One way to establish absolute convergence is the Root Test:

Root Test for Absolute Convergence

$$\lim\limits_{n\to\infty}\lvert a_n\rvert^{\frac{1}{n}}=L$$ If: \begin{align} L&\lt 1:\ \mathrm{the\ series\ converges} \\ L&\gt 1:\ \mathrm{the\ series\ diverges} \\ L&= 1:\ \mathrm{the\ root\ test\ is\ inconclusive} \end{align}

My Question

My professor said that the test is only applicable when the terms $a_n$ are positive. I was wondering if this is true, or whether it also works with negative or alternating terms. Is there a formal proof of the Root Test for Absolute Convergence?


Best Answer

The root test can be applied to any series, but since it deals with the sequence $\left(\sqrt[n]{\lvert a_n\rvert}\right)_{n\in\mathbb N}$, whenever it yields convergence, it necessarily tells you that the series not only converges, but converges absolutely.

If $\lim_{n\to\infty}\sqrt[n]{\lvert a_n\rvert}=L<1$, take a number $r\in(L,1)$. Then, for some natural $N$, $$n\geqslant N\implies\sqrt[n]{\lvert a_n\rvert}<r\iff\lvert a_n\rvert<r^n.$$ Since the series $\sum_{n=N}^\infty r^n$ converges, $\sum_{n=N}^\infty a_n$ converges absolutely, and therefore $\sum_{n=1}^\infty a_n$ converges absolutely.
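The geometric comparison in this step can be checked numerically. A sketch under the assumption $a_n = n/2^n$ (so $L = 1/2$), with the choice $r = 0.7$:

```python
# Verify the key inequality from the proof: once |a_n|**(1/n) < r,
# we have |a_n| < r**n, so the tail of sum |a_n| is dominated by
# a geometric series. Assumed example: a_n = n / 2**n, L = 1/2, r = 0.7.
r = 0.7
a = lambda n: n / 2 ** n
N = 5  # for n >= 5, (n / 2**n) ** (1/n) = n**(1/n) / 2 < 0.7
assert all(a(n) ** (1 / n) < r for n in range(N, 2000))
tail = sum(a(n) for n in range(N, 2000))
geom_bound = r ** N / (1 - r)  # sum of r**n for n >= N
print(tail < geom_bound)  # the tail stays below the geometric bound
```

The choice of $r$ and the cutoff $N$ are illustrative; any $r\in(L,1)$ works with a suitable $N$.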

If $\lim_{n\to\infty}\sqrt[n]{\lvert a_n\rvert}=L>1$, then $\lvert a_n\rvert>1$ for all sufficiently large $n$, and therefore you don't have $\lim_{n\to\infty}a_n=0$. So, the series $\sum_{n=1}^\infty a_n$ diverges.

Finally, the root test is inconclusive if $\lim_{n\to\infty}\sqrt[n]{\lvert a_n\rvert}=1$ because the series $\sum_{n=1}^\infty1$ diverges and the series $\sum_{n=1}^\infty\frac1{n^2}$ converges.

Answer

If the sequence $\{a_n\}_{n\in \mathbb{N}}$ is non-negative, the statement is obvious, since $a_n=\lvert a_n\rvert$. However, if we take $$a_n=\frac{(-1)^{n+1}}{n},$$ it is a known result that $$\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}=\ln(2).$$ If we instead sum the absolute values of the terms, we get $$\sum_{n=1}^{\infty}\lvert a_n\rvert=\sum_{n=1}^{\infty}\frac1n=\infty,$$ which means the statement does not hold as an equivalence for series with negative terms. The series above converges, but diverges when you take the absolute value of the summands; such a series is called conditionally convergent. For your initial result, what holds in general is that absolute convergence implies convergence, and the above example is a counterexample to the converse.
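The conditional convergence in this example can be illustrated numerically (a sketch; the cutoff N is arbitrary):

```python
import math

# Partial sums of the alternating harmonic series approach ln(2),
# while partial sums of |a_n| (the harmonic series) keep growing.
N = 200000
alt = sum((-1) ** (n + 1) / n for n in range(1, N + 1))
harm = sum(1 / n for n in range(1, N + 1))
print(abs(alt - math.log(2)))  # small: the alternating series converges
print(harm)                    # ~ ln(N) + 0.5772, growing without bound
```

The alternating-series error bound guarantees the first printed value is below $1/(N+1)$, while the harmonic partial sums grow like $\ln N$.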

You can use the root test to check whether a series converges absolutely by the arguments in Mr. Santos's answer.