Question: Consider the series $g_n(x)=\sum_{k=0}^n\cfrac{x^2}{(1+x^2)^k}$. Prove that the series converges pointwise to the function $$g(x)=\begin{cases} 0 & \text{ if } x=0 \\ 1+x^2 & \text{ if } x \neq 0 \end{cases}$$ but that the convergence is not uniform on any interval containing $0$ in its interior.
Help? Not even sure what the first step is on this one.
I'm assuming you're defining $$g_n(x)=\sum_{k=0}^n \frac{x^2}{(1+x^2)^k}$$
It is not hard to see that if $x=0$, then $g_n(0)=0$ for every $n$, whence $g_n(0)\to 0=g(0)$. If $x\neq 0$, we then have that
$$ \lim \;g_n(x)=\sum_{k=0}^\infty \frac{x^2}{(1+x^2)^k}$$
$$ =x^2\sum_{k=0}^\infty \left(\frac{1}{1+x^2}\right)^k$$
$$ =x^2\frac{1}{1-\frac{1}{1+x^2}}$$
$$ =x^2\frac{1+x^2}{1+x^2-1}$$
$$ ={1+x^2}$$
and all the manipulations are justified because $$0<\frac 1 {1+x^2}<1$$ for any $x\neq 0$, so the geometric series converges.
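A quick numerical sanity check of the pointwise limit (illustration only, not part of the proof): for a fixed $x\neq 0$, the partial sums approach $1+x^2$, while at $x=0$ every partial sum is $0$.

```python
def g_n(x, n):
    # Partial sum g_n(x) = sum_{k=0}^n x^2 / (1+x^2)^k.
    return sum(x**2 / (1 + x**2)**k for k in range(n + 1))

x = 0.5
# For x = 0.5 the limit should be 1 + x^2 = 1.25.
print(g_n(x, 5), g_n(x, 50), 1 + x**2)
# At x = 0 every partial sum vanishes.
print(g_n(0, 50))
```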
Can you see why the convergence is not uniform? Each $g_n$ is continuous, and a uniform limit of continuous functions is continuous; so if the convergence were uniform on an interval containing the origin, $g$ would be continuous there. But $g$ is discontinuous at $0$: $g(0)=0$ while $g(x)\to 1$ as $x\to 0$.
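You can also see the failure of uniformity quantitatively. Summing the geometric series gives $g_n(x)=(1+x^2)\bigl(1-(1+x^2)^{-(n+1)}\bigr)$ for $x\neq 0$, so $|g_n(x)-g(x)|=(1+x^2)^{-n}\to 1$ as $x\to 0$, and the sup of the error over any interval around $0$ never drops below values arbitrarily close to $1$. A small script (a sketch, sampling points in $(0,1]$) illustrates this:

```python
def g_n(x, n):
    # Partial sum g_n(x) = sum_{k=0}^n x^2 / (1+x^2)^k.
    return sum(x**2 / (1 + x**2)**k for k in range(n + 1))

def g(x):
    # The pointwise limit function.
    return 0.0 if x == 0 else 1 + x**2

def sup_err(n, pts=1000):
    # Sampled sup of |g_n - g| over points in (0, 1]; the true sup
    # over any interval containing 0 is at least this large.
    xs = [j / pts for j in range(1, pts + 1)]
    return max(abs(g_n(x, n) - g(x)) for x in xs)

for n in (1, 10, 100):
    print(n, sup_err(n))  # stays near 1, never tends to 0
```

Since $\sup_x |g_n(x)-g(x)|$ does not tend to $0$, the convergence cannot be uniform.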