Given $a_n = (-1)^n b_n$: if $b_n$ converges to $L \ne 0$, then $a_n$ diverges.
I was learning about alternating series on my own, and I came up with this statement, which would be very helpful if it is true.
Intuitively, it seems to be true, but I don't know how to prove it clearly.
My rough idea was: "if $n$ is large enough, $b_n$ is very close to $L$, so $a_n$ keeps alternating between values near $L$ and values near $-L$, and therefore $a_n$ does not converge" — but this explanation seems too vague to count as a proof.
You are correct. To make it rigorous:
If $b_n \to L \ne 0$, pick $\varepsilon$ with $0 < \varepsilon < |L|$. Since $a_n = b_n$ for even $n$ and $a_n = -b_n$ for odd $n$, for all sufficiently large even $n$ we have $|a_n - L| < \varepsilon$, and for all sufficiently large odd $n$ we have $|a_n + L| < \varepsilon$.
If $\lim_{n \to \infty} a_n = R$ existed, we would need both $|R - L| \le \varepsilon$ and $|R + L| \le \varepsilon$. That is impossible: by the triangle inequality, $2|L| = |(R + L) - (R - L)| \le |R + L| + |R - L| \le 2\varepsilon < 2|L|$, a contradiction. Equivalently, the intervals $[L - \varepsilon, L + \varepsilon]$ and $[-L - \varepsilon, -L + \varepsilon]$ are disjoint because their centers are $2|L| > 2\varepsilon$ apart, so $R$ cannot lie in both.
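A quick numerical sanity check of the two-subsequence argument, using the illustrative choice $b_n = 1 + 1/n$ (so $L = 1$, an assumption for the example, not part of the question): the even-indexed terms of $a_n$ cluster near $+1$ while the odd-indexed terms cluster near $-1$, so no single limit can serve both.

```python
# Sketch: a_n = (-1)^n * b_n with the assumed b_n = 1 + 1/n, which converges to L = 1.
def b(n):
    return 1 + 1 / n

def a(n):
    return (-1) ** n * b(n)

# Tail of the even-indexed subsequence: values close to +L.
even_tail = [a(n) for n in range(1000, 1010, 2)]
# Tail of the odd-indexed subsequence: values close to -L.
odd_tail = [a(n) for n in range(1001, 1011, 2)]

print(even_tail)  # all within 0.001 of  1
print(odd_tail)   # all within 0.001 of -1
```

Any candidate limit $R$ would have to be within $\varepsilon$ of both clusters at once, which the disjointness of the two intervals rules out.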