I have the function $f(x) = \frac{1}{x^2}$ and I want to represent it as a Taylor series centered at $x = 1$ (and then evaluate it at $x = 1.02$). First, I did it using the standard Taylor series formula:
$$ \sum_{n=0}^\infty\frac{f^{(n)}(a)}{n!}(x-a)^n = 1 - 2(x-1) + 3(x-1)^2 - 4(x-1)^3 + \cdots$$
And I did it using algebraic manipulations to get it to match a geometric series:
$$ \frac{1}{x^2}=\frac{1}{1-1+x^2} = \frac{1}{1-(1-x^2)} = \sum_{n=0}^\infty (1-x^2)^{n} $$ Or:
$$ \frac{1}{x^2}=\frac{1}{1-1+x^2} = \frac{1}{1-(-(x^2-1))} = \sum_{n=0}^\infty (-(x^2-1))^{n} = \sum_{n=0}^\infty (-1)^n(x^2-1)^{n} $$
These two end up being pretty close when I compare the partial sums $T_n(x)$, but they're not the same. Did I do something wrong?

Thank you for your help!
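As a quick numerical sketch (my own check, not part of the question): the two expansions are different power series (one in powers of $(x-1)$, the other in powers of $(x^2-1)$), so their partial sums differ term by term, yet both converge to $\frac{1}{x^2}$ at $x = 1.02$.

```python
# Compare partial sums of the two expansions of 1/x^2 at x = 1.02
# against the exact value. 20 terms is plenty this close to x = 1.
x = 1.02
exact = 1 / x**2

# Standard Taylor series about a = 1: sum of (-1)^n (n+1) (x-1)^n
taylor = sum((-1)**n * (n + 1) * (x - 1)**n for n in range(20))

# Geometric-series rewrite: sum of (-1)^n (x^2 - 1)^n
geometric = sum((-1)**n * (x**2 - 1)**n for n in range(20))

# Both are extremely close to the exact value, even though their
# n-th partial sums are different polynomials.
print(exact, taylor, geometric)
```

Truncating each sum at the same $n$ gives different polynomials, which is why the finite $T_n(x)$ values look close but not identical.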
You may use your "trick" but in a slightly different way.
To do so, note that $$\frac{1}{x^2} = -\left(\frac{1}{x}\right)' \qquad \text{and} \qquad \frac{1}{x} = \frac{1}{1+(x-1)} = \sum_{n=0}^{\infty}(-1)^n(x-1)^n \quad \text{for } |x-1|<1.$$
Now you get $$\frac{1}{x^2} = -\left( \sum_{n=0}^{\infty}(-1)^n(x-1)^n \right)' = \sum_{n=\color{blue}{1}}^{\infty}(-1)^{n+1}n(x-1)^{n-1} = \sum_{n=\color{blue}{0}}^{\infty}(-1)^{n}(n+1)(x-1)^{n}$$
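A small numerical sketch (my addition, assuming Python) confirming that the series obtained by differentiating the geometric series term by term does reproduce $\frac{1}{x^2}$ near $x = 1$:

```python
# Check that sum of (-1)^n (n+1) (x-1)^n equals 1/x^2 at x = 1.02,
# i.e. that termwise differentiation of the geometric series for 1/x
# (followed by a sign flip) gives the Taylor series of 1/x^2.
x = 1.02
exact = 1 / x**2

# Series from the answer: sum_{n>=0} (-1)^n (n+1) (x-1)^n
approx = sum((-1)**n * (n + 1) * (x - 1)**n for n in range(30))

print(abs(exact - approx))  # negligible: the series converges for |x-1| < 1
```

The coefficients $1, -2, 3, -4, \dots$ match the ones from the standard Taylor computation, so both routes give the same series in powers of $(x-1)$.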