If $x_n \to c$ and $f(x_n)=f(c)$ for all $n$, then all the derivatives at $c$ vanish


My book claims the following

Let $x_n \in (a,b)\setminus\{c\}$ be a sequence that converges to $c \in (a,b)$. If $f: (a,b) \to \mathbb{R}$ satisfies $f(x_n)=f(c)$ for all $n$, then every derivative of $f$ at $c$ that exists vanishes.
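
For concreteness, here is one example where the hypothesis holds (my own, not from the book): take $c = 0$,
$$ f(x) = \begin{cases} x^2 \sin(1/x), & x \neq 0,\\ 0, & x = 0,\end{cases} \qquad x_n = \frac{1}{n\pi} \to 0. $$
Then $f(x_n) = 0 = f(0)$ for all $n$, and indeed the only derivative of $f$ that exists at $0$, namely $f'(0) = \lim_{x \to 0} x\sin(1/x) = 0$, vanishes (here $f''(0)$ does not exist).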

This seems right, but I haven't been able to prove it; the book just says that it follows from Taylor's formula. What's a proof of this, preferably using Taylor's formula?


What I tried was to apply the formula to the $f(x_k)$: setting $h_k = x_k - c$, so that $h_k \to 0$, we get $$f(x_k)=f(c+h_k)=\sum^n_{i=0}\frac{f^{(i)}(c)}{i!}h^i_k+r(h_k)=f(c).$$

Letting $k$ go to infinity doesn't seem to accomplish anything, and I don't see what else can be done with the data available.



Best answer:

Induct on $k$. We begin with $k = 1$. Suppose that $f$ is differentiable at $c$. Then (one version of) Taylor's theorem says that $$ f(x) = f(c) + f'(c) (x - c) + h_1(x)(x-c), $$ where $h_1(x)$ is an error term such that $h_1(x) \to 0$ as $x \to c$. Rearranging, we have $$ f'(c) = \frac{f(x) - f(c)}{x - c} - h_1(x). $$ Now evaluating along the sequence $x_j \to c$ shows that $f'(c) = 0$.
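
Making the last step explicit (this is just the evaluation spelled out): since $f(x_j) = f(c)$ for every $j$ and $x_j \neq c$,
$$ f'(c) = \lim_{j\to\infty}\left[\frac{f(x_j)-f(c)}{x_j-c} - h_1(x_j)\right] = \lim_{j\to\infty}\bigl[0 - h_1(x_j)\bigr] = 0, $$
using $h_1(x) \to 0$ as $x \to c$.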

The input from Taylor's theorem is that $h_1(x) \to 0$.

Essentially the same argument holds for larger $k$. Having shown that the first $k$ derivatives all vanish, Taylor's theorem gives, in the $(k+1)$ case, $$ f(x) = f(c) + f^{(k+1)}(c) \frac{(x-c)^{k+1}}{(k+1)!} + h_{k + 1}(x) (x-c)^{k+1}, $$ with $h_{k+1}(x) \to 0$ as $x \to c$. Dividing by $(x-c)^{k+1}$ and evaluating along the sequence $x_j$ then shows, exactly as before, that $f^{(k+1)}(c) = 0$, completing the induction.
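
Spelled out, this is the same computation as in the base case: since $f(x_j) = f(c)$,
$$ \frac{f^{(k+1)}(c)}{(k+1)!} = \frac{f(x_j) - f(c)}{(x_j - c)^{k+1}} - h_{k+1}(x_j) = -h_{k+1}(x_j) \xrightarrow[j\to\infty]{} 0, $$
and the left-hand side does not depend on $j$, so $f^{(k+1)}(c) = 0$.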