My book explains the Leibniz test by saying:
"Assume a sub n is a positive sequence that converges to 0..."
And goes on to say that, in that case, the alternating series $\sum (-1)^n a_n$ converges. What if the sequence doesn't go to $0$? Does the Leibniz test say that the series diverges in that case?
I'm trying to determine if the following converges or diverges:
$$\sum\limits_{n=0}^\infty \frac{(-1)^n\,n}{\sqrt{n^2+1}}$$
I know from the book's answer that it diverges, but I don't know how I was supposed to determine that.
The Leibniz test only gives a *sufficient* condition for convergence; when its hypotheses fail, the test itself is inconclusive. What settles your example is the $n$-th term (divergence) test: if a series $\sum a_n$ converges, then $a_n$ must tend to $0$. The terms of your series do not tend to $0$, so the series cannot converge.
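To see that the terms do not tend to $0$, divide the numerator and denominator by $n$:

$$\lim_{n\to\infty}\frac{n}{\sqrt{n^2+1}}=\lim_{n\to\infty}\frac{1}{\sqrt{1+1/n^2}}=1,$$

so the terms $\frac{(-1)^n\,n}{\sqrt{n^2+1}}$ oscillate between values near $+1$ and $-1$ rather than shrinking to $0$.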
By the way, the Leibniz test also requires the sequence $(a_n)$ to be (at least eventually) monotonically decreasing, not just positive and tending to $0$.
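If you want a quick numerical sanity check (not a proof), here is a short Python sketch: consecutive partial sums of your series keep differing by roughly $1$, because the terms approach $\pm 1$ instead of $0$, so the partial sums never settle down.

```python
import math

def partial_sum(N):
    """Partial sum of sum_{n=0}^N (-1)^n * n / sqrt(n^2 + 1)."""
    return sum((-1) ** n * n / math.sqrt(n ** 2 + 1) for n in range(N + 1))

# Adjacent partial sums differ by the last term, whose magnitude
# tends to 1 -- so the sequence of partial sums cannot converge.
for N in (1000, 1001, 1002, 1003):
    print(N, partial_sum(N))
```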