This showed up as an optional challenge problem in my class:
Show that $\displaystyle \sum_{n=1}^\infty a_n$, $a_n>0$ converges if $\limsup(\sqrt[n]{a_n})<1$ and diverges if $\limsup(\sqrt[n]{a_n}) > 1.$
I have a feeling I could use the Weierstrass M-test, but I'm not sure where to start.
This is the so-called "Root Test"; see Rudin's *Principles of Mathematical Analysis*, Chapter $3$.
The idea of the proof is the comparison test rather than the M-test (the M-test concerns uniform convergence of series of *functions*, while here we have a series of numbers). You want to compare with a series you already know converges, namely a geometric series. Let $a=\limsup \sqrt[n]{|a_n|}$. Suppose $a<1$ and choose $\epsilon\in (a,1)$. By the definition of $\limsup$, there is some large $N$ such that $\sqrt[n]{|a_n|}<\epsilon$ for all $n\geq N$, and hence $|a_n|<\epsilon^n$ for all $n\geq N$. Since $0<\epsilon<1$, the geometric series $\sum \epsilon^n$ converges, so by the comparison test $\sum a_n$ converges as well.
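As a concrete sanity check of the convergent case, here is a small sketch using the example series $a_n = n/2^n$ (my choice, not from the original post), where $\sqrt[n]{a_n} = n^{1/n}/2 \to 1/2 < 1$:

```python
# Sketch of the root test on a_n = n / 2^n (an assumed example, not from the post).
# The n-th roots tend to 1/2, so they eventually stay below any eps in (1/2, 1).
def a(n):
    return n / 2 ** n

# Check that every n-th root up to n = 200 is below eps = 0.75 < 1,
# which gives the bound a_n < eps^n used in the comparison argument:
eps = 0.75
assert all(a(n) ** (1.0 / n) < eps for n in range(1, 201))

# The partial sums are then dominated by a geometric tail; for this series
# the exact sum is known to be 2 (from sum n x^n = x/(1-x)^2 at x = 1/2).
partial = sum(a(n) for n in range(1, 201))
print(round(partial, 6))  # -> 2.0
```

The `assert` mirrors the key step of the proof: once the $n$-th roots sit below a fixed $\epsilon < 1$, the terms are trapped under a convergent geometric series.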
The case $a>1$ can be handled similarly, or by noting that $\sqrt[n]{|a_n|}>1$, and hence $|a_n|>1$, for infinitely many $n$. In particular $\lim a_n\neq 0$, so the series cannot converge. (This is the proof in Rudin; since $a_n>0$ here, one could also note that infinitely many terms exceeding $1$ force the partial sums to grow without bound.)
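The divergent case can be illustrated the same way. Taking the example $a_n = 3^n/n^2$ (again my choice), the $n$-th root tends to $3 > 1$, and the terms themselves blow up rather than tending to $0$:

```python
# Sketch of the divergent case on a_n = 3^n / n^2 (an assumed example).
# Here the n-th root is 3 / n^(2/n), which tends to 3 > 1.
def a(n):
    return 3 ** n / n ** 2

# The n-th roots exceed 1 from some point on...
assert a(50) ** (1.0 / 50) > 1

# ...so infinitely many terms exceed 1 and a_n cannot tend to 0:
print(a(10) > 1 and a(50) > a(10))  # -> True
```

This matches the argument above: once $\sqrt[n]{a_n}>1$ infinitely often, the terms fail the necessary condition $a_n \to 0$, so the series diverges.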