I have this series: $\sum_{n=1}^\infty \frac{(-2a)^n}{n^2}$ and the question is for what values of $a$ the series converges. I'm supposed to write the interval that $a$ lies in. I first tried the ratio test:
$$ \frac{(-2a)^{n+1}}{(n+1)^2} \cdot \frac{n^2}{(-2a)^n} = \frac{-2a \cdot n^2}{(n+1)^2} = \frac{-2a \cdot n^2}{n^2+2n+1}.$$
I then divided the numerator and denominator by $n^2$:
$$ \frac{-2a \cdot n^2}{n^2+2n+1} = \frac{-2a}{1+2/n+1/n^2},$$
and when I took the limit as $n \to \infty$, I ended up with the inequality:
$$ -2a < 1. $$
I divided both sides by $-2$ (flipping the inequality) and got $a > -1/2$, but I'm supposed to give an interval. What did I do wrong?
You want to take absolute values in the ratio test. The result is that the series converges if $|-2a| < 1$, i.e. $-1/2 < a < 1/2$, and diverges if $|-2a| > 1$. You also need to check the boundary case $|-2a| = 1$, i.e. $a = \pm 1/2$, where the ratio test is inconclusive. There the terms have absolute value $1/n^2$, and since $\sum 1/n^2$ converges, the series converges absolutely at both endpoints. So the full interval of convergence is $-1/2 \le a \le 1/2$.
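As a quick numerical sanity check (not a proof), here is a small Python sketch of the partial sums. The function name `partial_sum` is mine, not from the question; at the endpoint $a = 1/2$ the series becomes $\sum (-1)^n/n^2 = -\pi^2/12$, while just outside the interval, at $a = 0.6$, the individual terms already grow without bound.

```python
import math

def partial_sum(a, N):
    """Partial sum of sum_{n=1}^{N} (-2a)^n / n^2 (hypothetical helper)."""
    return sum((-2 * a) ** n / n ** 2 for n in range(1, N + 1))

# At the endpoint a = 1/2 the terms are (-1)^n / n^2, and the series
# converges absolutely to -pi^2/12.
print(partial_sum(0.5, 10_000), -math.pi ** 2 / 12)

# Just outside the interval, a = 0.6 gives |ratio| -> 1.2 > 1,
# so the terms themselves grow and the series cannot converge.
print(abs((-2 * 0.6) ** 10 / 10 ** 2), abs((-2 * 0.6) ** 50 / 50 ** 2))
```

Note that the numerical check at $a = 0.5$ agrees with the alternating-series value, while at $a = 0.6$ the term at $n = 50$ is already much larger than the term at $n = 10$.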