Using Taylor series to create a zero function


Let $f(t)$ be an $n$th order polynomial with real, positive coefficients (I am not sure if these constraints are necessary).

Then after taking $n+1$ or more derivatives, the function vanishes identically. I am reading a book that claims that if $t_{0}$ is a root of $f$, then

$$0 = f(t) + \sum_{k=1}^{n}\frac{f^{(k)}(t)}{k!} (t_{0}-t)^{k}$$

and I have no idea how this could possibly be true.

I am familiar with the usual formula for a Taylor series centered at $t_{0}$,

$$f(t) = f(t_{0}) + f^{\prime}(t_{0})(t-t_{0}) + f^{\prime\prime}(t_{0})\frac{(t-t_{0})^{2}}{2!} + \dots$$

but the proposed formula seems to exchange the roles of $t$ and $t_{0}$, and I find it hard to believe it should hold for all $t$. I have tried using this formula where $t_{0}$ is not a root of $f$ and it clearly fails, but when $t_{0}$ is a root everything works out for the low-order polynomials I wrote out, and I cannot fathom why.

Edit: Just for clarification, when I say it doesn't work when $t_{0}$ is not a root, I mean that one must instead use the formula

$$f(t_{0}) = f(t) + \sum_{k=1}^{n}\frac{f^{(k)}(t)}{k!} (t_{0}-t)^{k},$$

which accounts for the fact that $f(t_{0})$ is no longer zero.
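A quick numerical check of both versions of the formula (a sketch; the quadratic $f(t)=(t-1)(t-2)$ and the sample points are my own choices, not from the book):

```python
def f(t):    # f(t) = (t - 1)(t - 2) = t^2 - 3t + 2
    return t**2 - 3*t + 2

def df(t):   # f'(t)
    return 2*t - 3

def d2f(t):  # f''(t)
    return 2.0

def rhs(t, t0):
    """f(t) + f'(t)(t0 - t) + f''(t)/2! * (t0 - t)^2"""
    return f(t) + df(t)*(t0 - t) + d2f(t)/2 * (t0 - t)**2

# At a root (t0 = 1, so f(t0) = 0) the sum vanishes for every t ...
for t in (-2.0, 0.5, 3.7, 10.0):
    assert abs(rhs(t, 1.0)) < 1e-9

# ... while at a non-root (t0 = 0, f(t0) = 2) it equals f(t0) instead.
for t in (-2.0, 0.5, 3.7):
    assert abs(rhs(t, 0.0) - f(0.0)) < 1e-9
```

So the sum always reproduces $f(t_{0})$; it is zero exactly when $t_{0}$ is a root.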

BEST ANSWER

Essentially, all that's going on is a notational issue. To clarify that, let's start with your expression for the Taylor series, but with the independent variable and the point of expansion instead labelled as $x$ and $t$ respectively:

$$f(x) = f(t) + f^{\prime}(t)(x-t) + f^{\prime\prime}(t)\frac{(x-t)^{2}}{2!} + \cdots=\sum_{k=0}^\infty \frac{f^{(k)}(t)}{k!}(x-t)^k$$

Now suppose that $f(x)=0$ has a solution $x=t_0$. Evaluating the Taylor series at this point yields the desired equality. (The sum is actually finite: since $f$ is a polynomial of degree $n$, every derivative beyond order $n$ vanishes, so only the terms up to $k=n$ survive, matching the book's formula.)

$$0 = f(t) + f^{\prime}(t)(t_0-t) + f^{\prime\prime}(t)\frac{(t_0-t)^{2}}{2!} + \cdots=\sum_{k=0}^\infty \frac{f^{(k)}(t)}{k!}(t_0-t)^k.$$
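This re-expansion identity can be verified for any polynomial, not just hand-picked quadratics. The sketch below (the coefficient list and test points are my own illustrative choices) represents a polynomial by its coefficients in ascending powers, differentiates it symbolically, and checks that the Taylor sum centered at an arbitrary $t$ reproduces $f(x)$ exactly:

```python
from math import factorial

def poly_eval(coeffs, x):
    """Evaluate sum(coeffs[i] * x**i) (ascending powers) via Horner's rule."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * x + c
    return result

def poly_deriv(coeffs):
    """Coefficients of the derivative polynomial."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def taylor_at(coeffs, t, x):
    """sum_{k=0}^{n} f^(k)(t)/k! * (x - t)^k, the Taylor expansion about t."""
    total, k, d = 0.0, 0, list(coeffs)
    while d:
        total += poly_eval(d, t) / factorial(k) * (x - t) ** k
        d = poly_deriv(d)
        k += 1
    return total

# Example: f(t) = t^2 - 3t + 2 has root t0 = 1.
coeffs = [2.0, -3.0, 1.0]
t0 = 1.0
for t in (-2.0, 0.5, 3.7):
    # Taylor expansion about t, evaluated at the root, is zero.
    assert abs(taylor_at(coeffs, t, t0)) < 1e-9
    # More generally it reproduces f(x) at any x.
    assert abs(taylor_at(coeffs, t, 4.0) - poly_eval(coeffs, 4.0)) < 1e-9
```

Because $f$ is a polynomial, the expansion is exact (no remainder term), which is why the identity holds for every center $t$.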