Why must it be true that approximating all the derivatives gives you the original function?


I am trying to understand the intuition behind Taylor/Maclaurin series.

You have some differentiable function $f(x)$ and you want to construct a series $g(x)$ where $f^{(n)}(x) = g^{(n)}(x)$, i.e. the $n$th derivatives of each give you the same output for some input.

Assuming we have this matching derivative output concept in place, how do we know this necessarily means $f(x)$ and $g(x)$ are equivalent representations of each other?

Normally these approximations are made in the neighborhood of $x=0$ (and yes we could use $x=a$, but for simplicity I'd like to stick with $0$), so it makes sense that $f(x)$ and $g(x)$ have equal $n$th derivatives at $x=0$, since that is how we derived $g(x)$ in the first place.

But what exactly lets us then take $g(x)$ and say "This will also work for any other $x$, not just $0$, since it is an equivalent to $f(x)$"?

In other words, I don't see why it is obvious that the method of creating the Taylor/Maclaurin series $g(x)$ must necessarily give an equivalent representation of $f(x)$.


There are 3 best solutions below


The property of functions that makes this possible is called "analyticity". In general, it is not true that a function is determined by its value and derivatives at a point. A famous example is the function $f(x)=e^{-1/x^2}$ for $x\neq 0$ with $f(0)=0$. It can be shown that all the derivatives of this function vanish at $x=0$, but obviously the function is not identically zero. So it is not "analytic". Analytic functions, like $e^x$, for example, can indeed be represented by an infinite power series whose coefficients depend only on the values of the function and its derivatives at a single point.
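As a quick numerical sanity check (a sketch in Python using only the standard library, not a proof), you can see just how flat $e^{-1/x^2}$ is near the origin:

```python
import math

def f(x):
    # f(x) = exp(-1/x^2) for x != 0, and f(0) = 0
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The function is astonishingly flat near 0:
print(f(0.1))    # e^{-100}, about 3.7e-44
print(f(0.05))   # e^{-400}, about 1.9e-174

# A centered finite-difference estimate of f'(0): at x = 1e-3 the value
# e^{-1e6} underflows to 0.0 in double precision, so the estimate is exactly 0
h = 1e-3
print((f(h) - f(-h)) / (2 * h))  # 0.0
```

Every finite-difference estimate of every derivative at $0$ behaves this way, which is the numerical shadow of the fact that all the true derivatives vanish there.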


It's not obvious, and in fact it isn't true. You can't conclude that $f(x)=g(x)$ just because all their derivatives agree at some particular point. For instance, let $$f(x)= \begin{cases} 0 & \text{ if }x\leq 0 \\ e^{-1/x} & \text{ if }x>0.\end{cases}$$ Then $f$ is infinitely differentiable, and in fact $f^{(n)}(0)=0$ for all $n$, so the Taylor series around $0$ is just $0$. But if $x>0$, then $f(x)$ is not equal to this Taylor series!
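To make the failure concrete (a minimal Python sketch; the piecewise definition is exactly the $f$ above):

```python
import math

def f(x):
    # f(x) = 0 for x <= 0, exp(-1/x) for x > 0
    return math.exp(-1.0 / x) if x > 0 else 0.0

# Every derivative of f at 0 is 0, so the degree-n Taylor polynomial at 0
# is the zero polynomial for every n -- but f itself is nonzero for x > 0:
print(f(1.0))   # e^{-1}, about 0.3679; the Taylor series at 0 predicts 0
print(f(-1.0))  # 0.0, where the zero Taylor series happens to agree
```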

More generally, a typical infinitely differentiable function is not equal to its Taylor expansion about a point. A function which is equal to its Taylor expansion in a neighborhood of any point is very very special and is called analytic.

In fact, even more strongly, the Taylor series of a typical infinitely differentiable function about a point usually won't even converge anywhere (except at the point itself). For instance, it's possible to construct an infinitely differentiable function $f:\mathbb{R}\to\mathbb{R}$ with $f^{(n)}(0)=(n!)^2$. The Taylor series around $0$ is then $\sum_n n! x^n$ which does not converge for any nonzero $x$.
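You can watch this divergence happen numerically (a small Python sketch; the exotic function $f$ itself is not constructed here, only the terms $n!\,x^n$ of its Taylor series):

```python
import math

# Terms of the series sum_n n! x^n at x = 0.1: they shrink at first, but
# since n! eventually outgrows any geometric decay, the terms blow up,
# so the partial sums cannot converge
x = 0.1
for n in (5, 10, 15, 20, 25, 30):
    print(n, math.factorial(n) * x**n)
```

By $n=30$ the individual term already exceeds $200$, and it only grows from there, which rules out convergence for this (and, by the same argument, any) nonzero $x$.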

What you can say, though, is that if it is possible to represent a function $f(x)$ as a power series $\sum_n a_nx^n$, then that power series must be the Taylor series for $f$ around $0$. This is just because you can differentiate the power series term-by-term (though this takes some work to make rigorous) to show that if $f(x)=\sum_n a_n x^n$ for all $x$ in a neighborhood of $0$, then $f^{(k)}(x)=\sum_n n(n-1)\cdots(n-k+1)a_nx^{n-k}$ as well, and so plugging in $x=0$ gives $f^{(k)}(0)=k!a_k$. That is, $a_k=\frac{f^{(k)}(0)}{k!}$. So if you think that your function $f$ is nice enough to be represented by some power series, then the Taylor series is the only power series that could work.
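The term-by-term computation can be mimicked for truncated series (a Python sketch; `deriv_coeffs` is a hypothetical helper name, and the series $\sum_n x^n$ for $1/(1-x)$ is used only because its coefficients $a_n = 1$ are easy to check against):

```python
import math

def deriv_coeffs(a):
    # term-by-term derivative: sum_n a[n] x^n  ->  sum_n n*a[n] x^(n-1)
    return [n * a[n] for n in range(1, len(a))]

# Truncation of the series for 1/(1-x): a_n = 1 for all n
a = [1.0] * 10

for k in range(5):
    c = a
    for _ in range(k):
        c = deriv_coeffs(c)
    fk0 = c[0]  # the constant term is f^(k)(0)
    print(k, fk0, fk0 / math.factorial(k))  # f^(k)(0) = k!, recovering a_k = 1
```

Each round of differentiation shifts the list left and multiplies by the exponent, so after $k$ rounds the constant term has picked up exactly the factor $k(k-1)\cdots 1 = k!$, which is the computation in the paragraph above.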


Since you are asking for an intuitive explanation, consider a polynomial of degree $n$.
Such a polynomial is uniquely determined by $n+1$ points it passes through, i.e. it is the same as the interpolating polynomial through those points.
How the $n+1$ points are chosen does not matter from a theoretical point of view (though computationally it can create some difficulties).
Whichever method you use to construct the interpolating polynomial (Newton, finite differences, Lagrange, etc.), you end up with the expression of the original polynomial, which coincides with its Taylor series.

That means that the $n+1$ points can be chosen to be "very near" to each other and to the origin, and in the limit the interpolating polynomial becomes the one with the same $f^{(0)}(0),\,f^{(1)}(0),\,\cdots,\,f^{(n)}(0)$, i.e. the Taylor polynomial.
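This limiting process can be observed numerically (a Python sketch taking $f(x)=e^x$ as the test function; `divided_differences` is just the standard Newton divided-difference table written out by hand):

```python
import math

def divided_differences(xs, ys):
    # Newton divided-difference coefficients of the interpolating polynomial;
    # the k-th coefficient equals f^(k)(xi)/k! for some xi between the nodes
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

# Interpolate f(x) = e^x at 4 nodes clustered near the origin
h = 1e-3
xs = [k * h for k in range(4)]
ys = [math.exp(x) for x in xs]

# As h -> 0 these approach the Maclaurin coefficients of e^x: 1, 1, 1/2, 1/6
print(divided_differences(xs, ys))
```

Shrinking $h$ drives the coefficients toward $f^{(k)}(0)/k!$, which is exactly the "points merging into derivatives" intuition of this answer.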

That premised, any function which can be represented as a polynomial in the limit of "infinite degree" — or, more rigorously, an analytic function, as rightly noted in the answers above — will coincide with its Taylor series.