Are there alternative proofs of the general Taylor-series expansion theorem for real functions?


With a view to better understanding real Taylor series, I have examined some books on basic Calculus, with an eye for the proofs of the Taylor series theorem and the possible authors' comments on its derivation. (My reaction when I first saw a proof of it, many years ago, was a mixture of great surprise and anxiety. And still, while I understand the individual steps, the way they all combine to produce e.g. the series for $\sin x$ strikes me as little short of miraculous.)

Up to now, from the books I have seen, I get the same impression: that this theorem is a technical exercise in repeated applications of the mean-value theorem. And we are lucky that some useful functions happen to have all derivatives bounded, so the remainder tends to zero and a nice series occurs, with nothing else to be said. But some authors do place some comments close to what I feel, albeit not very encouraging, e.g:

from Calculus, by Karl Menger: "Taylor's formula (...) is one of the great marvels of mathematics. (...) This is something like a mathematical action at a distance (...)"

from Real Analysis, by Laczkovich & Sós: "The statement of Theorem (...) is actually quite surprising (...) the derivatives of f at a alone determine the values of the function at every other point (...)"

from Introduction to the Calculus, by Osgood: "(...) Since it took the race two centuries to develop this formula after the Calculus was invented, the student will not be surprised that the reasons which underlie it cannot be given him in a few words. Let him accept it as a deus ex machina."

Now all this inquiry may be overly romantic and obsessive on my part, and Taylor series may be a perfect example of the "cold and austere beauty of mathematics", as Russell put it. But I think that sharing mental experiences helps the mind broaden its horizons, so may I ask:
What was your reaction when you first saw this theorem? And has your general understanding of it changed ever since, by some other way of looking at it and proving it?



BEST ANSWER

It's simple to discover Taylor series. Let's start with $$ \tag{1}f(x) = f(a) + \int_a^x f'(s) \, ds, $$ which of course is just the fundamental theorem of calculus. Now if we are feeling playful we might note (again by FTC) that $f'(s) = f'(a) + \int_a^s f''(t) \, dt$. Plugging this into (1), we find that \begin{align} f(x) &= f(a) + \int_a^x f'(a) + \int_a^s f''(t) \, dt \,ds \\ \tag{2}&= f(a) + f'(a)(x - a) + \underbrace{\int_a^x \int_a^s f''(t) \, dt \, ds}_{\text{remainder}}. \end{align} We can keep going like this for as long as we want. The next step is to note that $f''(t) = f''(a) + \int_a^t f'''(u) \, du$. Plugging this into (2), we find that \begin{align} f(x) &= f(a) + f'(a) (x - a) + \int_a^x \int_a^s f''(a) + \int_a^t f'''(u) \, du \, dt \, ds \\ &= f(a) + f'(a)(x - a) + \int_a^x f''(a)(s - a) + \int_a^s \int_a^t f'''(u) \, du \, dt \, ds \\ &= f(a) + f'(a)(x - a) + f''(a) \frac{(x-a)^2}{2} + \underbrace{\int_a^x \int_a^s \int_a^t f'''(u) \, du \, dt \, ds}_{\text{remainder}}. \end{align} You see the pattern. So we have discovered the Taylor polynomial approximation to $f(x)$, and we have a formula for the remainder.
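The identity in (2) can be checked numerically. Below is a minimal Python sketch (the midpoint-rule integrator and the particular choices $f = \sin$, $a = 0$, $x = 1$ are my own illustration, not part of the derivation):

```python
import math

def integrate(g, lo, hi, n=400):
    """Midpoint-rule approximation of the integral of g over [lo, hi]."""
    h = (hi - lo) / n
    return sum(g(lo + (k + 0.5) * h) for k in range(n)) * h

a, x = 0.0, 1.0
f, f1, f2 = math.sin, math.cos, lambda t: -math.sin(t)

# remainder = int_a^x int_a^s f''(t) dt ds, as in (2)
remainder = integrate(lambda s: integrate(f2, a, s), a, x)

lhs = f(x)
rhs = f(a) + f1(a) * (x - a) + remainder
print(abs(lhs - rhs))  # agreement up to quadrature error
```

The first-order polynomial plus the double-integral remainder reproduces $f(x)$ exactly; the only discrepancy comes from the numerical quadrature.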


By the way, if $| f'''(u) | \leq M$ for all $u \in [a,x]$, then the remainder $R(x)$ satisfies \begin{align} | R(x) | &\leq \int_a^x \int_a^s \int_a^t | f'''(u) | \, du \, dt \, ds \\ &\leq \int_a^x \int_a^s \int_a^t M \, du \, dt \, ds \\ &= M \frac{(x-a)^3}{3!}. \end{align} You see what the bound on the remainder will be for higher order Taylor series approximations. So we see that the remainder will be small if $x$ is close to $a$.

(If $f$ is sine or cosine, we can take $M = 1$. If $f$ is the exponential function, we can take $M = e^x$, the maximum of $e^u$ on $[a,x]$ when $x > a$.)
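As a quick numerical sanity check (my own illustration, taking $f = \sin$, $a = 0$, $M = 1$), the second-order remainder indeed stays below $M(x-a)^3/3!$:

```python
import math

a, M = 0.0, 1.0  # expand sin about a = 0; |sin'''(u)| = |cos(u)| <= 1
for x in [0.1, 0.5, 1.0]:
    # second-order Taylor polynomial of sin at a
    p2 = math.sin(a) + math.cos(a) * (x - a) - math.sin(a) * (x - a) ** 2 / 2
    err = abs(math.sin(x) - p2)
    bound = M * (x - a) ** 3 / math.factorial(3)
    print(f"x = {x}: |R(x)| = {err:.6f} <= {bound:.6f}")
```

Note how the bound shrinks like $(x-a)^3$ as $x$ approaches $a$, which is exactly the "small remainder near $a$" behaviour described above.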

ANOTHER ANSWER

Consider the function $p(x) = \sum_{i=0}^\infty c_i(x-a)^i = c_0 + c_1(x-a) + c_2(x-a)^2 + \cdots$

Here $c_i$ represents some scalar.

Consider an arbitrary function $f(x)$. We want $p(x) = f(x)$ for all $x \in \mathbb{R}$. In other words, we want $p(x)$ to "match" $f(x)$. We will assume that this function $f(x)$ is "nice enough" in that it is defined on all of $\mathbb{R}$ and is of class $C^\infty$ (meaning that it has derivatives of every order).

We start by picking a point $a \in \mathbb{R}$. Note that then $p(a) = c_0$. Now if $p(x)$ is to equal $f(x)$ on $\mathbb{R}$, we must have $p(a) = f(a)$, so it follows that $c_0 = f(a)$.

Now consider $p'(x) = \sum_{i=1}^\infty i\,c_i(x-a)^{i-1} = c_1 + 2c_2(x-a) + 3c_3(x-a)^2 + \cdots$

If $p(x)$ is to equal $f(x)$ on $\mathbb{R}$, we need $p'(a) = f'(a)$: the tangent lines to both functions at $a$ should have the same slope.

Since $p'(a) = c_1$, it is clear that $c_1 = f'(a)$.

Now we look at $p''(x) = \sum_{i=2}^\infty i(i-1)\,c_i(x-a)^{i-2} = 2(1)c_2 + 3(2)c_3(x-a) + 4(3)c_4(x-a)^2 + \cdots$

If $p(x)$ is to equal $f(x)$ on $\mathbb{R}$, we need $p''(a) = f''(a)$ (the instantaneous rates of change of the first derivatives of each function should be equal). Since $p''(a) = 2(1)c_2 = 2!\,c_2$, it follows that $c_2 = \frac{f''(a)}{2!}$. The reason for writing $2(1)$ as $2!$ should become immediately clear in the next paragraph.

Let's look at the general $n$th derivative of $p(x)$, denoted $p^{(n)}(x)$.

$p^{(n)}(x) = \sum_{i=n}^\infty i(i-1)\cdots(i-(n-1))\,c_i(x-a)^{i-n} = n!\,c_n + (n+1)(n)\cdots(2)\,c_{n+1}(x-a) + \cdots$

We set $p^{(n)}(a) = f^{(n)}(a)$. Since $p^{(n)}(a) = n!\,c_n$, we have that $c_n = \frac{f^{(n)}(a)}{n!}$.
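To see the coefficient formula $c_n = f^{(n)}(a)/n!$ in action, here is a small Python sketch for $f = \sin$ at $a = 0$. (The use of the four-step cycle of derivatives of $\sin$ and the truncation at 12 terms are my own choices for illustration.)

```python
import math

a, x = 0.0, 1.2
# Derivatives of sin repeat with period 4: sin, cos, -sin, -cos.
cycle = [math.sin, math.cos, lambda t: -math.sin(t), lambda t: -math.cos(t)]

# c_n = f^(n)(a) / n!
c = [cycle[n % 4](a) / math.factorial(n) for n in range(12)]

p = sum(cn * (x - a) ** n for n, cn in enumerate(c))
print(abs(p - math.sin(x)))  # already very small with 12 terms
```

Even this short truncation matches $\sin(1.2)$ to several decimal places, since the neglected terms are bounded by $|x-a|^{12}/12!$.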

Using the above facts, we can rewrite $p(x)$ as:

$p(x) = \sum_{i=0}^\infty \frac{f^{(i)}(a)}{i!}(x-a)^i = f(a) + f'(a)(x-a) + \frac{f''(a)}{2!}(x-a)^2 + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n + \cdots$

(Note that $f^{(0)}(a) = f(a)$.)

$p(x)$ is the Taylor series of $f(x)$ at $x = a$. The formula should be completely intuitive: if two functions $p(x)$ and $f(x)$ are to "match" one another, then all of their derivatives at the single point $x = a$ must be equal.
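A minimal convergence check in Python (my own illustration: $f = e^x$ at $a = 0$, where every derivative equals $1$), showing the partial sums of $p(x)$ approach $f(x)$ as more terms are taken:

```python
import math

def partial_sum(derivs, a, x):
    """Partial Taylor sum: sum of f^(i)(a)/i! * (x-a)^i over the supplied derivatives."""
    return sum(d * (x - a) ** i / math.factorial(i) for i, d in enumerate(derivs))

x = 1.0
errors = []
for n in [2, 5, 10, 15]:
    # f = exp: f^(i)(0) = 1 for every i
    approx = partial_sum([1.0] * (n + 1), 0.0, x)
    errors.append(abs(math.e - approx))
    print(n, errors[-1])
```

The error shrinks rapidly with $n$ because the factorial in the denominator eventually dominates $(x-a)^n$.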

On a final note: in general, $p(x)$ will not equal a function for all $x \in \mathbb{R}$ if that function does not satisfy the requirements stated in the second paragraph.