Does absolute convergence of $\sum_{n=1}^{\infty} a_n$ imply that $|a_n|^{1/n}$ tends to some $r \in [0,1]$?


True or false: if $\sum_{n=1}^{\infty} a_n$ is an absolutely convergent series, does $|a_n|^{1/n} \rightarrow r$ for some $r \in [0,1]$?

My initial progress: evidently $|a_n|\to 0$, so eventually $|a_n| < 1$ and therefore $|a_n|^{1/n} < 1$. At this point I tried a proof by contradiction, aiming to show that something goes wrong if $|a_n|^{1/n}$ does not converge to a point of $[0,1]$, but I didn't succeed.

This is essentially the converse of the root test, so I looked through several analysis books for a potential counter-example, but I didn't find any relevant discussion.

If the statement is true, supply a detailed proof; otherwise, provide a counter-example. Thank you!



Best answer

The point is that you can have one subsequence along which the root test works and another along which it fails (the limit of $|a_n|^{1/n}$ along that subsequence is $1$), while the series still converges.

This leads to the counter-example $a_{2n}=2^{-2n}$ and $a_{2n+1}=n^{-2}$ (for $n\ge1$): the series converges absolutely, yet $|a_n|^{1/n}$ equals $\tfrac12$ along the even indices and tends to $1$ along the odd ones, so it has no limit.
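As a quick numerical sanity check (the indexing $n \ge 1$ and the helper function `a` below are my own, not part of the answer), one can evaluate $|a_k|^{1/k}$ along the two subsequences:

```python
# Counter-example from the answer: a_{2n} = 2^{-2n}, a_{2n+1} = n^{-2} (n >= 1).
# The series converges absolutely (a geometric part plus a p-series part),
# but |a_k|^{1/k} has two distinct subsequential limits: 1/2 (evens), 1 (odds).

def a(k):
    """k-th term of the counter-example sequence (k >= 2, my own indexing)."""
    if k % 2 == 0:
        return 2.0 ** (-k)            # a_{2n} = 2^{-2n}, so exponent is -k
    n = (k - 1) // 2
    return 1.0 / n ** 2               # a_{2n+1} = n^{-2}

even_root = a(1000) ** (1 / 1000)     # (2^{-1000})^{1/1000} = 1/2 exactly
odd_root = a(1001) ** (1 / 1001)      # (500^{-2})^{1/1001}, close to 1

print(even_root, odd_root)
```

The even-indexed roots are constantly $\tfrac12$, while the odd-indexed roots approach $1$, confirming that $|a_k|^{1/k}$ oscillates.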

Another answer

If $\sum a_n$ is absolutely convergent, the best you can say (by the root test, or equivalently the Cauchy–Hadamard formula applied to the power series $\sum a_n x^n$, whose radius of convergence must satisfy $R\ge1$) is $$\limsup_{n \to \infty} |a_n|^{1/n}=\frac1R\le1,$$ but $|a_n|^{1/n}$ is not necessarily convergent.

An easy counterexample is $$a_{2n}=\frac1{n^2},\quad a_{2n+1}=0 \qquad (n\ge1):$$ here $|a_{2n}|^{1/(2n)} = n^{-1/n} \to 1$, while every odd-indexed root is $0$, so $|a_n|^{1/n}$ has no limit.
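As an illustrative check (the indexing $n\ge1$ is my assumption), the even-indexed roots creep up toward $1$ while the odd-indexed ones are identically $0$:

```python
# Second counter-example: a_{2n} = 1/n^2 (n >= 1), a_{2n+1} = 0.
# |a_{2n}|^{1/(2n)} = n^{-1/n} -> 1, while every odd-indexed root is 0,
# so |a_n|^{1/n} oscillates and has no limit, yet sum a_n converges.
roots = [(1.0 / n ** 2) ** (1.0 / (2 * n)) for n in (10, 100, 1000)]
print(roots)  # strictly increasing toward 1
```

Together with the zero odd-indexed terms, this shows the sequence of $n$-th roots has the two subsequential limits $0$ and $1$.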