Why do we use only the ratio test, and not conditional convergence, to determine the interval of convergence of an alternating series?


For example, consider $$S=\sum_{n=1}^{\infty} \frac{(-1)^n x^n}{\sqrt{n}}.$$ When determining the interval of convergence, we use the ratio test to find the interval in which the series converges absolutely. From the ratio test we get the IoC to be $$-1 \lt x \le 1.$$

Why don't we use the Leibniz Test for an alternating series to find the IoC? Wouldn't it give a larger interval of convergence?

I couldn't find this addressed anywhere online. Is there a particular reason we do it this way?


2 Answers

BEST ANSWER

Your assumption is wrong: the ratio test does not let you deduce that the series converges if and only if $-1<x\leqslant 1$. The ratio test is inconclusive when $x=1$, and that is exactly why you need the Leibniz test to determine whether the series converges at $x=1$. (At $x=-1$ the series becomes $\sum 1/\sqrt{n}$, which diverges.)
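To make this concrete, applying the ratio test to the series above gives

$$\lim_{n\to\infty}\left|\frac{(-1)^{n+1}x^{n+1}/\sqrt{n+1}}{(-1)^{n}x^{n}/\sqrt{n}}\right| = |x|\lim_{n\to\infty}\sqrt{\frac{n}{n+1}} = |x|,$$

so the series converges absolutely for $|x|<1$ and diverges for $|x|>1$. At $|x|=1$ the limit equals $1$ and the test says nothing, so the two endpoints must be examined by other means.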

ANSWER

The ratio test is a test for absolute convergence: it gives you the radius of the interval of convergence, but tells you nothing about the endpoints. To determine the behavior at the endpoints, you must use some other test that can detect conditional as well as absolute convergence, such as the Leibniz test.
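As a quick numerical sanity check (a sketch of my own, not part of the answer; the helper `partial_sum` is a hypothetical name), the endpoint behavior can be seen directly from partial sums: at $x=1$ the alternating partial sums settle near a limit, while at $x=-1$ they grow without bound.

```python
import math

def partial_sum(N, x):
    """Partial sum of sum_{n=1}^{N} (-1)^n x^n / sqrt(n)."""
    return sum(((-1) ** n) * x ** n / math.sqrt(n) for n in range(1, N + 1))

# At x = 1 the terms alternate in sign and 1/sqrt(n) decreases to 0,
# so the Leibniz test applies and the partial sums converge (to about -0.60).
print(partial_sum(10_000, 1))

# At x = -1 every term becomes +1/sqrt(n), a divergent p-series (p = 1/2),
# so the partial sums grow roughly like 2*sqrt(N).
print(partial_sum(10_000, -1))
```

The contrast between the two calls is the whole point: the ratio test treats both endpoints identically (limit equal to $1$), yet the series behaves completely differently at each one.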