An infinite series $\sum a_n$ is absolutely convergent if $\sum |a_n|$ is convergent. However, just because $\sum |a_n|$ fails to converge by some test doesn't mean the original series $\sum a_n$ can't still be conditionally convergent. This much I know.
So, when presented with series tests such as the Ratio and Root Tests, I'm a bit confused. They both involve the absolute values of the terms. For example, the Ratio Test says:
Take $L = \lim\limits_{n \rightarrow \infty} \left |\frac{a_{n+1}}{a_n} \right|$
$\bullet$ If $L < 1$, the series converges absolutely.
$\bullet$ If $L > 1$, the series diverges.
$\bullet$ If $L = 1$, the test is inconclusive.
My question is: is this divergence absolute? For example, I know that if I put $\sum (-1)^n \frac{n!}{9^n}$ through the Ratio Test, I would end up with infinity, which is greater than one. This would signal that the series diverges by the Ratio Test. Is that the end? Or can I try some other test WITHOUT the absolute value that could signal conditional convergence?
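For reference, here is the ratio computation for that example written out (a sketch of the standard Ratio Test calculation, with $a_n = (-1)^n \frac{n!}{9^n}$):

```latex
\[
L=\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right|
 =\lim_{n\to\infty}\frac{(n+1)!}{9^{n+1}}\cdot\frac{9^n}{n!}
 =\lim_{n\to\infty}\frac{n+1}{9}
 =\infty>1.
\]
```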
If $\lim_{n\to\infty}\sqrt[n]{\lvert a_n\rvert}>1$ or if $\lim_{n\to\infty}\frac{\lvert a_{n+1}\rvert}{\lvert a_n\rvert}>1$, then $\lvert a_n\rvert$ is eventually increasing, so you don't have $\lim_{n\to\infty}a_n=0$. By the $n$th-term (divergence) test, both series $\displaystyle\sum_{n=0}^\infty\lvert a_n\rvert$ and $\displaystyle\sum_{n=0}^\infty a_n$ diverge. So divergence from the Ratio or Root Test is not merely "absolute" divergence: no other test can rescue conditional convergence.
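As a quick numerical sanity check (a sketch in Python; the helper name `abs_term` is mine, not standard), you can watch the ratios $\lvert a_{n+1}/a_n\rvert = (n+1)/9$ grow without bound, and the terms $\lvert a_n\rvert = n!/9^n$ themselves eventually blow up rather than tend to $0$:

```python
from math import factorial

def abs_term(n):
    """|a_n| = n! / 9^n for the series sum of (-1)^n * n!/9^n."""
    return factorial(n) / 9**n

# The ratio |a_{n+1}| / |a_n| simplifies to (n+1)/9 and grows without bound.
ratios = [abs_term(n + 1) / abs_term(n) for n in (5, 10, 20, 30)]
print(ratios)

# Once the ratio exceeds 1 (n >= 8), |a_n| is increasing, so a_n cannot
# tend to 0 and the series diverges regardless of the alternating signs.
print([abs_term(n) for n in (10, 20, 30)])
```

The printed terms make the point of the answer above concrete: since $\lvert a_n\rvert \to \infty$, the $n$th-term test already rules out convergence of any kind, conditional or absolute.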