Why is the ratio test for $L=1$ inconclusive?


One of the most frequently used tests for convergence ($L\lt 1$) and divergence ($L\gt 1$) of an infinite series is the ratio test.

The idea behind it, and the reason it works, is comparison with a geometric series that dominates (or is dominated by) the tested series.

My question:
With the idea in mind that a geometric series dominates (or is dominated by) the tested one, it is a mystery to me why the test is inconclusive in the case $L=1$, since a geometric series with ratio $\geq 1$ clearly diverges.

I see that there are examples with $L=1$ that converge, yet I don't understand why. I have no intuition for that case.

Could anybody help? Thank you!


Because the ratio test doesn't involve comparison with a geometric series of ratio $L$, but rather one with ratio close to $L$.

If your series has $L = \lim_{n\rightarrow\infty}\left|\frac{a_{n+1}}{a_n}\right| < 1$, then for any $\epsilon >0$, $\left|\frac{a_{n+1}}{a_n}\right|$ is eventually less than $L+\epsilon$, so your series is eventually dominated by a geometric series of ratio $L+\epsilon$. If you take $\epsilon$ small enough that $L+\epsilon<1$, then this geometric series will converge, and so your original series converges.
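To spell out the comparison, here is a short sketch (with $N$ denoting an index beyond which the bound $\left|\frac{a_{n+1}}{a_n}\right| \le L+\epsilon$ holds):

$$|a_{N+k}| \le (L+\epsilon)\,|a_{N+k-1}| \le \cdots \le (L+\epsilon)^k\,|a_N|, \qquad\text{so}\qquad \sum_{k\ge 0}|a_{N+k}| \le |a_N|\sum_{k\ge 0}(L+\epsilon)^k = \frac{|a_N|}{1-(L+\epsilon)} < \infty.$$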

Similarly, if $L>1$, then for $\epsilon$ small enough that $L-\epsilon>1$, your series eventually dominates a geometric series of ratio $L-\epsilon$, and so diverges.
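In the same spirit, a sketch of the $L>1$ case (again with $N$ an index beyond which $\left|\frac{a_{n+1}}{a_n}\right| \ge L-\epsilon$ holds):

$$|a_{N+k}| \ge (L-\epsilon)^k\,|a_N| \xrightarrow[k\to\infty]{} \infty \quad (\text{since } L-\epsilon>1),$$

so the terms do not tend to $0$ and the series diverges.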

But if $L=1$, neither of these approaches works. For any $\epsilon > 0$, $1+\epsilon > 1$, so the geometric series that you can show eventually dominate your series are all divergent. Similarly, $1-\epsilon<1$, so the geometric series you can show are eventually dominated by your series are all convergent.
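To see concretely that $L=1$ carries no information, compare the standard pair $\sum \frac1n$ and $\sum \frac1{n^2}$: both have ratio limit $1$,

$$\frac{1/(n+1)}{1/n} = \frac{n}{n+1} \to 1, \qquad \frac{1/(n+1)^2}{1/n^2} = \frac{n^2}{(n+1)^2} \to 1,$$

yet the harmonic series diverges while $\sum \frac1{n^2}$ converges (to $\pi^2/6$). The limit of the ratios simply cannot distinguish the two.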