What is the interval of convergence for power series


I have a power series and I need to find its interval of convergence: $$\sum_{n=2}^\infty \frac {n}{(n^2-1) (1+x)^n} $$ I tried the ratio test with the substitution $ t = \frac{1}{1+x} $ and got a radius of convergence of one ($R = 1$), so the interval of convergence would be $ x \in [-2,0] $. But that doesn't seem right, because if I put $x = 4$, the ratio test says the series converges. So my question is: what is the interval of convergence, and how do you get it? Thank you.
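As a quick numerical sanity check (a sketch; `term` and `ratio` are helper names introduced here, not from the question), one can evaluate the ratio-test quotient $|a_{n+1}/a_n|$ at a large $n$: its limit is $\frac{1}{|1+x|}$, so it should be below $1$ at $x = 4$ and above $1$ at a point such as $x = -0.5$ from the claimed interval $[-2, 0]$:

```python
# Ratio-test check for sum_{n>=2} n / ((n^2 - 1) (1 + x)^n).
# A numerical sketch; term() and ratio() are hypothetical helper names.

def term(n, x):
    """n-th term of the series, defined for n >= 2 and x != -1."""
    return n / ((n**2 - 1) * (1 + x) ** n)

def ratio(n, x):
    """|a_{n+1} / a_n| at a given n; as n -> infinity this tends to 1/|1+x|."""
    return abs(term(n + 1, x) / term(n, x))

# x = 4 gives |1+x| = 5 > 1, so the ratio is near 1/5 and the series converges.
r_inside = ratio(50, 4.0)
# x = -0.5 gives |1+x| = 0.5 < 1, so the ratio is near 2 and the terms blow up.
r_outside = ratio(50, -0.5)

print(r_inside, r_outside)  # roughly 0.196 and 1.96
```

This matches the observation in the question: the series converges at $x = 4$, so $[-2, 0]$ cannot be the interval of convergence.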



BEST ANSWER

Just consider another power series $$ g(y) = \sum_{n = 2}^\infty \frac{n}{n^2 - 1} \cdot y^n $$ This is a standard power series with radius of convergence $1$; it diverges at $y = 1$ (the terms behave like $\frac{1}{n}$) and converges at $y = -1$ (alternating series test), so it converges exactly on $[-1,1[$. Now, to determine where your series converges, just apply the change of variable $ \displaystyle y = \frac{1}{x+1}$. Then $|y| < 1$ means $|x+1| > 1$, i.e. $x < -2$ or $x > 0$, and the endpoint $y = -1$ corresponds to $x = -2$. So the condition $y \in [-1,1[$ is equivalent to $$ x \in \; ]-\infty, -2] \; \cup \; ]0, +\infty[ $$
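The endpoint behaviour of $g$ can also be checked numerically (a sketch; `partial_sum` is a hypothetical helper name): at $y = -1$ the alternating partial sums settle down quickly, while at $y = 1$ they keep growing roughly like $\ln N$, since $\frac{n}{n^2-1} \sim \frac{1}{n}$:

```python
# Partial sums of g(y) = sum_{n=2}^{N} n/(n^2 - 1) * y^n at the two endpoints.
# A numerical sketch; partial_sum() is a hypothetical helper name.

def partial_sum(y, N):
    """Partial sum of g(y) up to n = N."""
    return sum(n / (n**2 - 1) * y**n for n in range(2, N + 1))

# y = -1: alternating series, so successive partial sums stay close together.
s_neg_small = partial_sum(-1.0, 10**3)
s_neg_large = partial_sum(-1.0, 10**4)

# y = 1: terms behave like 1/n, so the partial sums grow roughly like ln N
# (the gap between N = 10^3 and N = 10^4 is about ln 10, i.e. around 2.3).
s_pos_small = partial_sum(1.0, 10**3)
s_pos_large = partial_sum(1.0, 10**4)

print(abs(s_neg_large - s_neg_small), s_pos_large - s_pos_small)
```

This is consistent with $g$ converging on $[-1, 1[$ but not at $y = 1$.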