When using the ratio test for absolute convergence of a series $\sum_{n=1}^\infty a_{n}$, if $$\lim_{n \to \infty} |a_{n+1}|/|a_{n}| = 1,$$ the test is inconclusive. However, if $$|a_{n+1}|\ge|a_{n}|$$ for all sufficiently large values of $n$, does that imply that the series is divergent?
For instance, if I am correct, the series $$\sum_{n=1}^\infty (n!/n^n)x^n$$ converges for $|x|<e$ and diverges if $|x|>e$.
But when $|x|=e$, the limit of the ratio is $1$, and $$|a_{n+1}|=|a_{n}|\,e\left(\frac{n}{n+1}\right)^n,$$ where $(n/(n+1))^n>1/e$ for all values of $n$, so that $|a_{n+1}|>|a_{n}|$, and the series is then divergent. Is this correct?
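A quick numerical sanity check of this reasoning (the function name `term` below is just for illustration): at $|x|=e$, every ratio $e\,(n/(n+1))^n$ should exceed $1$, and the terms $|a_n| = n!\,e^n/n^n$ should grow rather than tend to $0$. By Stirling's formula they grow like $\sqrt{2\pi n}$.

```python
import math

def term(n, x=math.e):
    # |a_n| = (n!/n^n) |x|^n, computed via logs to avoid overflow:
    # log|a_n| = log(n!) - n log n + n log|x|
    return math.exp(math.lgamma(n + 1) - n * math.log(n) + n * math.log(abs(x)))

# Every ratio |a_{n+1}|/|a_n| = e * (n/(n+1))**n exceeds 1 ...
ratios = [math.e * (n / (n + 1)) ** n for n in range(1, 50)]
print(all(r > 1 for r in ratios))

# ... so the terms grow; they track Stirling's sqrt(2*pi*n) closely.
for n in (10, 100, 1000):
    print(n, term(n), math.sqrt(2 * math.pi * n))
```

Since the terms do not tend to $0$, the series must diverge at $|x|=e$, consistent with the argument above.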
On the other hand, if $|a_{n+1}|<|a_{n}|$ for all large values of $n$, we cannot conclude anything. Is this true?
If $\vert a_{n+1} \vert \geq \vert a_n\vert$ for all $n>N$, then $\vert a_n \vert \geq \vert a_N \vert$ for every $n > N$. Except in the trivial case that $a_n=0$ for all sufficiently large $n$, this lower bound is positive, so $\lim_{n\to\infty}a_n\not=0$. Hence $\sum a_n$ diverges by the term test.
If $\vert a_{n+1} \vert \lt \vert a_n\vert$ for large $n$, then we cannot conclude. For instance, both $\frac{1}{n}$ and $\frac{1}{n^2}$ are strictly decreasing, yet $\sum\frac{1}{n}$ diverges while $\sum\frac{1}{n^2}$ converges.
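A short numeric illustration of that contrast (partial sums up to a cutoff $N$, chosen here just for demonstration): the harmonic partial sums grow like $\log N$ without bound, while the partial sums of $\sum 1/n^2$ settle near $\pi^2/6$.

```python
import math

N = 100_000

# Partial sums of two series whose terms both strictly decrease.
harmonic = sum(1.0 / n for n in range(1, N + 1))      # diverges like log(N)
basel = sum(1.0 / n ** 2 for n in range(1, N + 1))    # converges to pi^2/6

print(harmonic, math.log(N))   # harmonic tracks log(N) (+ Euler's constant ~0.5772)
print(basel, math.pi ** 2 / 6)
```

So a strictly decreasing sequence of terms tells us nothing by itself; only the ratio-limit criterion (limit $< 1$) gives convergence.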