For which values of $a$ does the series $\sum_{n=1}^\infty \frac{(-2a)^n}{n^2}$ converge?


I have this series: $\sum_{n=1}^\infty \frac{(-2a)^n}{n^2}$ and the question is for what values of $a$ the series converges. I'm supposed to write the interval $a$ is in. I first tried the ratio test:

$$ \frac{(-2a)^{n+1}}{(n+1)^2} \cdot \frac{n^2}{(-2a)^n} = \frac{-2a \cdot n^2}{(n+1)^2} = \frac{-2a \cdot n^2}{n^2+2n+1}.$$

I then divided the numerator and denominator by $n^2$:

$$ \frac{-2a \cdot n^2}{n^2+2n+1} = \frac{-2a}{1+2/n+1/n^2},$$

and when I took the limit as $n \to \infty$, I ended up with the inequality:

$$ -2a < 1. $$

I divided both sides by $-2$ and got that $a$ is $-1/2$, but I'm supposed to give an interval. What did I do wrong?


There are 2 best solutions below


You want to take absolute values in the ratio test. The result should be that it converges if $|-2a| < 1$, i.e. $-1/2 < a < 1/2$, and diverges if $|-2a| > 1$. But you also need to look at the case $|-2a|=1$, where the ratio test is inconclusive.
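For completeness, a standard check of the boundary case (not spelled out in the answer above): when $|-2a| = 1$ the series converges absolutely, since

$$\left|\frac{(-2a)^n}{n^2}\right| = \frac{1}{n^2}$$

and $\sum \frac{1}{n^2}$ converges. So the series converges exactly for $-\frac12 \le a \le \frac12$.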


For the series to converge, its terms must tend to $0$:

$$\frac{(-2a)^n}{n^2} \to 0 \implies |2a|\le 1.$$

Then:

  • for $2a = 0$ every term is $0$, so the series trivially converges;

  • for $-1 \le 2a < 0$ the terms $\frac{(-2a)^n}{n^2}$ are positive and the series converges by the comparison test with $\sum \frac1{n^2}$;

  • for $0 < 2a < 1$ the series converges by the alternating series test;

  • for $2a = 1$ the series is $\sum \frac{(-1)^n}{n^2}$, which converges absolutely.

So the series converges exactly when $|2a| \le 1$, i.e. $-\frac12 \le a \le \frac12$.
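As a quick numerical sanity check of the case analysis above (a sketch; `partial_sum` is a hypothetical helper, not part of the original answers):

```python
import math

def partial_sum(a, N):
    """Partial sum of sum_{n=1}^{N} (-2a)^n / n^2."""
    return sum((-2 * a) ** n / n ** 2 for n in range(1, N + 1))

# Inside the interval, e.g. a = 1/4, the partial sums settle quickly.
print(partial_sum(0.25, 50))

# At the endpoints the series still converges:
# a = -1/2 gives sum 1/n^2 = pi^2/6, and a = 1/2 gives sum (-1)^n/n^2 = -pi^2/12.
print(partial_sum(-0.5, 10**5), math.pi**2 / 6)
print(partial_sum(0.5, 10**5), -math.pi**2 / 12)
```

Outside the interval, e.g. $a = 1$, the terms $\frac{(-2)^n}{n^2}$ grow without bound, so the partial sums diverge.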