Does an alternating sequence diverge when its magnitude does not converge to 0?


Given $a_n = b_n \cdot (-1)^n$: if $b_n$ converges to $L \neq 0$, then $a_n$ does not converge (it diverges).

I was learning about alternating series on my own, and I came up with this statement, which would be very helpful if it is true.

Intuitively, it seems to be true, but I don't know how to prove it clearly.

I came up with the idea that "if $n$ is large enough, $b_n$ is very close to $L$, so $a_n$ alternates between values near $L$ and values near $-L$, and therefore $a_n$ does not converge", but this explanation seems too vague to count as a rigorous proof.


There are 2 best solutions below


You are correct. To make it rigorous:

If $b_n \to L \ne 0$, take $\varepsilon$ such that $|L| > \varepsilon > 0$. Then for sufficiently large even $n$ we have $|a_n - L| < \varepsilon$ and for sufficiently large odd $n$, $|a_n + L| < \varepsilon$.
If $\lim_{n \to \infty} a_n = R$ existed, we'd need both $|R - L| \le \varepsilon$ and $|R+L| \le \varepsilon$, and this is impossible: the intervals $[L-\varepsilon, L+\varepsilon]$ and $[-L-\varepsilon, -L+\varepsilon]$ are disjoint, since their centers $L$ and $-L$ are $2|L| > 2\varepsilon$ apart.
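As a numerical illustration of this argument (not part of the proof), take the hypothetical sequence $b_n = 3 + 1/n$, so $L = 3$, and check that the even- and odd-indexed terms of $a_n$ eventually stay within $\varepsilon$ of $L$ and $-L$ respectively:

```python
# Illustration: b_n = 3 + 1/n converges to L = 3, and a_n = (-1)^n * b_n.
# The even-indexed terms settle near +3 and the odd-indexed terms near -3,
# so a_n cannot approach a single limit.
def b(n):
    return 3 + 1 / n

def a(n):
    return (-1) ** n * b(n)

eps = 1.0   # any eps with 0 < eps < |L| works; here L = 3
N = 100     # a "sufficiently large" cutoff for this example

evens = [a(n) for n in range(N, N + 50, 2)]       # even n: a_n = b_n
odds  = [a(n) for n in range(N + 1, N + 50, 2)]   # odd n:  a_n = -b_n

print(all(abs(x - 3) < eps for x in evens))  # True: within eps of  L
print(all(abs(x + 3) < eps for x in odds))   # True: within eps of -L
```

Any candidate limit $R$ would have to be within $\varepsilon$ of both $3$ and $-3$, which is impossible.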


Without loss of generality, I'll assume $L>0$.

Just note that, given any $\varepsilon>0$, there exists $n_0(\varepsilon)$ such that for any $n>n_0$ we have

$$ |b_n-L|<\varepsilon, $$

that is, $b_n\in [L-\varepsilon,L+\varepsilon]$.

Now for odd $n$ we have $a_n=-b_n\in [-L-\varepsilon,-L+\varepsilon]$, while for even $n$ we have $a_n=b_n\in [L-\varepsilon,L+\varepsilon]$.

Then if I assume that $a_n$ converges to some $C$, every subsequence of $a_n$ must also converge to $C$. The even-indexed subsequence eventually lies in the closed interval $[L-\varepsilon,L+\varepsilon]$, so its limit satisfies $C\in[L-\varepsilon,L+\varepsilon]$; likewise, the odd-indexed subsequence gives $C\in[-L-\varepsilon,-L+\varepsilon]$.

Taking $\varepsilon = L/2$, this gives $C\in[-3L/2,-L/2]$ and $C\in[L/2,3L/2]$.

Since these intervals are disjoint, we've reached a contradiction, so there can be no such $C$.
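As a quick sanity check on that last step (with a hypothetical sample value $L = 2$, so $\varepsilon = L/2 = 1$), the two intervals are $[1, 3]$ and $[-3, -1]$, which indeed do not overlap:

```python
# Sanity check of the interval disjointness, using the sample value L = 2.
L = 2.0
eps = L / 2

pos = (L - eps, L + eps)      # [L/2, 3L/2]  = [1.0, 3.0]
neg = (-L - eps, -L + eps)    # [-3L/2, -L/2] = [-3.0, -1.0]

# Two closed intervals [a, b] and [c, d] are disjoint iff b < c or d < a.
disjoint = pos[1] < neg[0] or neg[1] < pos[0]
print(disjoint)  # True: no C can lie in both intervals
```

The same check succeeds for any $L > 0$, since $-L/2 < L/2$.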