Suppose that,
$$f(x)=\sum_{n=0}^{\infty}a_nx^n$$
converges on $(-R,R)$ for some $R>0$. Let $(x_n)$ be a sequence in $(-R,R)$ with $x_n\ne 0$ for all $n$ but $\lim x_n=0$, and suppose $f(x_n)=0$ for all $n\in\mathbb{N}$. I need to show that $f(x)=0$ for all $x\in(-R,R)$.
My thinking is that I could use Rolle's Theorem, which states: if $f$ is continuous on a closed interval $[a,b]$, differentiable on the open interval $(a,b)$, and $f(a)=f(b)$, then there is at least one point $c$ in $(a,b)$ where $f'(c)=0$.
Can I use this logic, and if so, how?
If $f$ is not identically zero, there must be some minimal $N$ such that $a_N\neq0$. This implies $$f(x)=x^N\sum_{n=0}^\infty a_{n+N}x^n.$$ Since a power series is continuous on its interval of convergence, the series $g(x)=\sum_{n=0}^\infty a_{n+N}x^n$ is continuous at $0$, so $\lim_{x\to0}\frac{f(x)}{x^N}=g(0)=a_N$. Since $x_k\neq0$ and $f(x_k)=0$, this implies $$0=\frac{f(x_k)}{x_k^N}\to a_N\qquad\text{as }k\to\infty,$$ that is, $a_N=0$, a contradiction. This completes the proof.
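Not part of the proof, but here is a quick numerical sanity check of the key limit $f(x_k)/x_k^N\to a_N$. I take $f(x)=\sin x$ as an example: its power series has $a_0=0$ and $a_1=1$, so the minimal index with a nonzero coefficient is $N=1$, and the ratio $f(x_k)/x_k$ should approach $a_1=1$ along any sequence $x_k\to 0$:

```python
import math

# Example: f(x) = sin(x), whose power series has a_0 = 0, a_1 = 1,
# so N = 1 and f(x_k)/x_k^N should tend to a_N = 1 as x_k -> 0.
N = 1
x_seq = [1.0 / k for k in (10, 100, 1000, 10000)]  # a sequence tending to 0
ratios = [math.sin(x) / x**N for x in x_seq]
print(ratios)  # ratios approach 1 = a_N
```

Of course this only illustrates the limit, not the contradiction itself: in the proof the $x_k$ are zeros of $f$, which forces the limit to be $0$ instead.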