Theorem (Leibniz): If an alternating series
$u_1-u_2+u_3-u_4+\dots\:\:\:(u_n>0)$
has decreasing terms,
$u_1>u_2>u_3>\dots>u_n>\dots$, and if $\lim_{n\to\infty}u_n=0$, then the series converges, and its sum is positive and does not exceed the first term.
Problem: Show that $\frac{2x}{1}-\frac{(2x)^2}{2}+\frac{(2x)^3}{3}-...$ converges.
I tried to solve the question using both the Leibniz theorem and d'Alembert's criterion (the ratio test).
If $|x|<1$ then $\frac{(2x)^n}{n}>\frac{(2x)^{n+1}}{n+1}$ and $\lim_{n\to\infty}\frac{(2x)^n}{n}=\lim_{n\to\infty}\frac{(2x)^n\ln(2)}{1}=0$.
So the series converges for $|x|<1$.
However, using d'Alembert's theorem (ratio test):
$\lim_{n\to\infty}|\frac{\frac{(2x)^{n+1}}{n+1}}{\frac{(2x)^{n}}{n}}|=\lim_{n\to\infty}|\frac{{n}}{n+1}||2x|=|2x|$
The series converges when $|2x|<1\iff|x|<\frac{1}{2}$.
Question:
Why do these two methods give different results for the domain of convergence? Which one is wrong, and why?
Thanks in advance!
The problem with your application of the Leibniz test is that the sequence $(2x)^n / n$ is not a decreasing sequence in all cases. In particular, the ratio of consecutive terms is
$$\left|\frac{(2x)^{n + 1} / (n + 1)}{(2x)^n / n}\right| = \frac{n}{n + 1} \cdot 2|x|.$$
If $|x| > \frac 1 2$, then once $n$ is large enough (recalling that $\lim_{n \to \infty} \frac n {n + 1} = 1$), this ratio is strictly greater than $1$. As such, the series does not converge.
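To illustrate (a quick numerical sketch, not part of the argument above), here is the ratio $\frac{n}{n+1}\cdot 2|x|$ computed for the sample value $x = 0.75$: it starts below $1$ but climbs toward $2|x| = 1.5$, so the terms eventually increase in absolute value.

```python
# Ratio of consecutive terms |a_{n+1}/a_n| = (n/(n+1)) * 2|x|
# for the sample value x = 0.75 (an assumption for illustration).
x = 0.75
ratios = [(n / (n + 1)) * 2 * abs(x) for n in range(1, 101)]

print(ratios[0])   # n = 1:   (1/2) * 1.5 = 0.75, still below 1
print(ratios[99])  # n = 100: close to the limit 2|x| = 1.5 > 1
```

Since the ratio exceeds $1$ for all large $n$, the sequence of terms is eventually increasing, so the monotonicity hypothesis of the Leibniz test fails.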
In fact, this suggests a simpler proof of divergence: If $|x| > \frac 1 2$, then the terms don't even tend to zero.
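A short numerical check of that last claim (again using the assumed sample value $x = 0.75$): the terms $|2x|^n / n$ blow up rather than tending to zero, so the divergence test applies directly.

```python
# Terms |2x|^n / n for x = 0.75: geometric growth of 1.5^n
# overwhelms the 1/n factor, so the terms grow without bound.
x = 0.75
terms = [abs(2 * x) ** n / n for n in range(1, 51)]

print(terms[0])   # n = 1:  1.5
print(terms[-1])  # n = 50: already in the millions
```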