Where is the nontriviality of Taylor's theorem?


Note that I am specifically looking at this version of Taylor's theorem:

Let $f: \mathbb{R}\to\mathbb{R}$ be $n$-times differentiable at $x$. Then $\exists\,g : \mathbb{R} \to \mathbb{R}$ where $\lim\limits_{h\to 0}g(x + h) = 0$ and $$f(x + h) = h^n\, g(x + h) + \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)$$

My question is, what makes this a nontrivial theorem? Can't we just solve for $g$ in a single step?
Why is the observation that $g$ exists useful at all?

3 Answers

Best Answer

(Answering my own question since this is something that had been confusing me sometime ago and so I had posted this question for others' reference.)

The nontriviality is in the condition

$$\lim\limits_{h\to 0}g(x + h) = 0$$

The theorem would be trivial if this condition did not need to hold: for $h \neq 0$ we could simply solve the equation for $g$, thanks to the $h^n$ coefficient. But it is far from obvious that such a $g$ can also be made to satisfy this limit condition.

Clarification: the factor $h^n$ does not prevent such a $g$ from existing, despite the division by zero at $h = 0$ when solving for $g$: we can define $g$ piecewise, setting $g(x) = 0$ separately and solving for $g$ everywhere else. The real content is the limit (continuity) requirement on $g$ at $x$, not the mere existence of a function $g$ satisfying the equation.
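A minimal numeric sketch of this point: take $f = \sin$, $x = 0$, $n = 3$, solve the equation for $g$ at $h \neq 0$, and watch the values shrink toward $0$ (the function name `g` and the sample points below are just illustrative choices, not part of the theorem).

```python
import math

def g(h, n=3):
    """Solve the Taylor equation for g at h != 0, with f = sin, x = 0, n = 3.
    The degree-3 Taylor polynomial of sin at 0 is h - h**3/6."""
    taylor = h - h**3 / 6
    return (math.sin(h) - taylor) / h**n

# The quotient shrinks toward 0 as h -> 0, as the theorem promises:
# here g(h) behaves like h**2/120, since sin h = h - h^3/6 + h^5/120 - ...
for h in (1e-1, 1e-2, 1e-3):
    print(f"h = {h:g}: g(h) = {g(h):.3e}")
```

At $h = 0$ the formula itself is undefined, which is exactly why the piecewise definition $g(x) = 0$ is needed; the computation only shows that this value makes $g$ continuous there.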

Answer

This is the so-called Taylor-Young theorem; it is the key result that proves, a priori, the existence of Taylor expansions!

For example, consider the map $f:\mathbb{R}\to\mathbb{R},\,x\mapsto x+\exp(x)$. It is easy to see that $f$ is a smooth bijection and that $f'$ never vanishes. Therefore its inverse is also smooth, and in particular $n$-times differentiable at $0$, for every $n$. Applying the Taylor-Young theorem, we know that $f^{-1}$ has an $n$th-order expansion at $0$. The coefficients can be obtained by identification, using the fact that $\forall x\in\mathbb{R},\,f^{-1}(f(x))=x$.
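As a sketch of the identification (expanding, for convenience, at $f(0) = 1$ rather than at $0$; the method is the same at any point): write $f^{-1}(1+y) = a_1 y + a_2 y^2 + a_3 y^3 + \cdots$ and substitute $y = f(x) - 1 = 2x + \frac{x^2}{2} + \frac{x^3}{6} + \cdots$. Matching the coefficients of $x$, $x^2$, $x^3$ in $f^{-1}(f(x)) = x$ gives

$$2a_1 = 1,\qquad \frac{a_1}{2} + 4a_2 = 0,\qquad \frac{a_1}{6} + 2a_2 + 8a_3 = 0,$$

so $a_1 = \tfrac{1}{2}$, $a_2 = -\tfrac{1}{16}$, $a_3 = \tfrac{1}{192}$, and so on.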

Answer

It's 'surprising' that the error term can be written as $h^n g(x+h)$ because, if you solve for $g$, you get

$$ \frac{f(x + h) - \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x)}{h^n} = g(x + h) $$

and because of the division, the solution only works for $h \neq 0$. A priori, it's not at all obvious that you can even pick a value for $g(x)$ so that $g$ is continuous at $x$, let alone that $g(x) = 0$ is the value that does so.

The usefulness is that it shows the error made by approximating $f(x+h)$ with the Taylor polynomial is negligible compared with $h^n$ as $h \to 0$.
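Equivalently, in the standard little-$o$ notation, the theorem says

$$f(x + h) = \sum_{k=0}^n\frac{h^k}{k!}\,f^{(k)}(x) + o(h^n)\qquad (h\to 0),$$

which is the form in which the expansion is usually quoted and used.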