Proving $∆^nf(x_0;h_1,\cdots,h_n)=f^{(n)}(ξ)h_1\cdots h_n$


Let's define the finite difference of order $n$ at $x_0$ for a function $f$ as: $$\Delta ^1 f(x_0;h_1)= f(x_0+h_1)-f(x_0)$$ \begin{align*}\Delta ^2 f(x_0;h_1,h_2)&= \Delta^1 f(x_0+h_2;h_1)-\Delta^1 f(x_0;h_1)\\ &=f(x_0+h_2+h_1)-f(x_0+h_2)-\left [ f(x_0+h_1)-f(x_0) \right ]\end{align*} $$\vdots$$ $$\Delta ^n f(x_0;h_1,\cdots,h_n)= \Delta^{n-1} f(x_0+h_n;h_1,\cdots,h_{n-1})-\Delta^{n-1} f(x_0;h_1,\cdots,h_{n-1})$$
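The recursion can be transcribed directly as a quick numerical sketch (the helper name `delta` and the test values are mine, purely illustrative):

```python
def delta(f, x0, hs):
    """Finite difference Delta^n f(x0; h1, ..., hn) with hs = [h1, ..., hn],
    following the recursive definition above."""
    if not hs:
        return f(x0)                      # Delta^0 f(x0) = f(x0)
    *rest, hn = hs
    # Delta^n f(x0; h1, ..., hn)
    #   = Delta^{n-1} f(x0 + hn; h1, ..., h_{n-1})
    #   - Delta^{n-1} f(x0; h1, ..., h_{n-1})
    return delta(f, x0 + hn, rest) - delta(f, x0, rest)

# Order 2 agrees with the expanded second-order formula:
f = lambda x: x**3
x0, h1, h2 = 0.5, 0.1, 0.2
val = delta(f, x0, [h1, h2])
expanded = f(x0 + h2 + h1) - f(x0 + h2) - (f(x0 + h1) - f(x0))
```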

Let's suppose now that $f\in\mathcal{C}^{n-1}([a,b])$ and that $f^{(n)}$ exists on $(a,b)$. I would like to show that if $x_0$, $x_0+h_1$, $x_0+h_2$, $x_0+h_1+h_2$, $\cdots$, $x_0+h_1+\cdots+h_n$ all lie inside $[a,b]$, then there exists a point $\xi$ inside the smallest interval containing all of them such that $$\Delta^nf(x_0;h_1,\cdots,h_n)=f^{(n)}(\xi)h_1\cdots h_n$$ I tried induction but could not complete the proof.

Is there a way that does not use functions of more than one variable? Thanks.

EDIT: By induction I succeeded in proving only that $$\Delta^nf(x_0;h_1,\cdots,h_n)\leq f^{(n)}(\xi)h_1\cdots h_n$$


On BEST ANSWER

$\def\d{\mathrm{d}}\def\peq{\mathrel{\phantom{=}}{}}$Lemma: For any differentiable function $g$ and any $0 \leqslant m \leqslant n$,$$ \frac{\d}{\d x}(∆^m g(x; h_1, \cdots, h_m)) = ∆^m g'(x; h_1, \cdots, h_m). $$

Proof: It will be proved by induction. For $m = 0$, there is $\dfrac{\d}{\d x}(∆^0 g(x)) = \dfrac{\d}{\d x}(g(x)) = g'(x) = ∆^0 g'(x)$. Now assume that it holds for $m$, then by the induction hypothesis,\begin{align*} &\peq \frac{\d}{\d x}(∆^{m + 1}g(x; h_1, \cdots, h_{m + 1})) = \frac{\d}{\d x}(∆^m g(x + h_{m + 1}; h_1, \cdots, h_m) - ∆^m g(x; h_1, \cdots, h_m))\\ &= ∆^m g'(x + h_{m + 1}; h_1, \cdots, h_m) - ∆^m g'(x; h_1, \cdots, h_m) = ∆^{m + 1} g'(x; h_1, \cdots, h_{m + 1}). \end{align*} End of induction.
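The lemma can also be spot-checked numerically, comparing a central-difference approximation of $\frac{\mathrm{d}}{\mathrm{d}x}\,∆^m g$ against $∆^m g'$ (an illustrative Python sketch with $g = \sin$; the helper `delta` just transcribes the recursive definition from the question):

```python
import math

def delta(f, x0, hs):
    # Recursive finite difference Delta^m f(x0; h1, ..., hm) from the question.
    if not hs:
        return f(x0)
    *rest, hm = hs
    return delta(f, x0 + hm, rest) - delta(f, x0, rest)

g, gp = math.sin, math.cos          # g' = cos
x, hs, eps = 0.3, [0.1, 0.2], 1e-6
# Central-difference approximation of d/dx Delta^2 g(x; h1, h2):
lhs = (delta(g, x + eps, hs) - delta(g, x - eps, hs)) / (2 * eps)
# The lemma says this should match Delta^2 g'(x; h1, h2):
rhs = delta(gp, x, hs)
```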

Now back to the problem. For $1 \leqslant k \leqslant n$, define$$ F_k(x) = ∆^k f^{(n - k)}(x; h_1, \cdots, h_k),\quad G_k(x) = ∆^{k - 1} f^{(n - k)}(x; h_1, \cdots, h_{k - 1}), $$ and also define $F_0(x) = f^{(n)}(x)$, then the lemma implies that $G_k' = F_{k - 1}$.

Next it will be proved by induction on $n \geqslant m \geqslant 0$ that $F_n(x_0) = F_m(ξ_m) \prod\limits_{k = m + 1}^n h_k$ for some $ξ_m$. For $m = n$, it suffices to take $ξ_n = x_0$. Assume that it holds for $m$ ($m \geqslant 1$). The mean value theorem implies that there exists $ξ_{m - 1}$ satisfying$$ F_m(ξ_m) = G_m(ξ_m + h_m) - G_m(ξ_m) = G_m'(ξ_{m - 1}) h_m = F_{m - 1}(ξ_{m - 1}) h_m, $$ thus $F_n(x_0) = F_m(ξ_m) \prod\limits_{k = m + 1}^n h_k = F_{m - 1}(ξ_{m - 1}) \prod\limits_{k = m}^n h_k$. End of induction.

Finally, taking $m = 0$ yields$$ ∆^n f(x_0; h_1, \cdots, h_n) = F_n(x_0) = F_0(ξ_0) \prod_{k = 1}^n h_k = f^{(n)}(ξ_0) \prod_{k = 1}^n h_k. $$
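As a numerical sanity check of this identity (illustrative only): take $f = \exp$, so that $f^{(n)} = \exp$ for every $n$, and the guaranteed $\xi$ can even be recovered explicitly as $\xi = \log\bigl(∆^n f / (h_1 \cdots h_n)\bigr)$:

```python
import math

def delta(f, x0, hs):
    # Recursive finite difference Delta^n f(x0; h1, ..., hn) from the question.
    if not hs:
        return f(x0)
    *rest, hn = hs
    return delta(f, x0 + hn, rest) - delta(f, x0, rest)

f = math.exp                               # f^(n) = exp for every n
x0, hs = 0.0, [0.1, 0.2, 0.15]
ratio = delta(f, x0, hs) / math.prod(hs)   # should equal exp(xi) for some xi
lo, hi = x0, x0 + sum(hs)                  # smallest interval containing the nodes
xi = math.log(ratio)                       # the xi guaranteed by the theorem
```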


I had an idea that was too long for a comment, though it is not a complete solution. I felt it may be helpful to you, so I’ve provided it below:

This is essentially applying the mean value theorem with induction.

Let’s define the “divided difference” of a function as

$$D_h(f)(x) = \frac{f(x+h)-f(x)}{h} $$

If we consider the interval $[x_0, x_0+h]$ (the order here might be wrong since $h$ could be negative, so we should really say “the minimum interval containing both endpoints”) and let $f$ be differentiable over this interval, then the mean value theorem applies, and we have that there exists an $\eta \in [x_0, x_0+h]$ such that

$$ f'(\eta) = D_h(f)(x_0) = \frac{f(x_0 + h)-f(x_0)}{h} $$ Or in your notation (and you’ll see why we are doing this)

$$ h f'(\eta) = f(x_0 + h)-f(x_0) = \Delta f(x_0; h)$$
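For $f = \exp$ this base case can be checked concretely, since $f'(\eta) = e^\eta$ lets us solve for $\eta$ explicitly (an illustrative sketch; the names are mine):

```python
import math

f = math.exp                        # f' = exp as well
x0, h = 0.0, 0.5
dlt = f(x0 + h) - f(x0)             # Delta f(x0; h)
# Solve f'(eta) = Delta f(x0; h) / h, i.e. exp(eta) = dlt / h:
eta = math.log(dlt / h)
```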

You can go further and say that there is some function $\eta(x)$ such that:

$$ h f'(\eta(x)) = \Delta f(x; h)$$

And we have that $\eta(x)$ is always contained in the smallest interval containing $x$ and $x+h$.

Now, to continue building our intuition, let’s say we have some $g(x) = \Delta f(x; h_1)$. Over the minimum interval containing $x_0$, $x_0+h_1$, $x_0+h_2$, there is some $c$ in the interval such that

$$ \Delta(g; h_2)(x_0) = h_2\, g'(c)$$

Filling in the definition of g we have that:

$$ \Delta(f; h_1; h_2)(x_0) = h_2 \left( \Delta f(x; h_1)\right)'(c) $$

$$ \Delta(f; h_1; h_2)(x_0) = h_2 h_1 \left(f'(\eta(x))\right)'(c) $$

Which means:

$$ \Delta(f; h_1; h_2)(x_0) = h_2 h_1 f''(\eta(c))\, \eta'(c)$$

Now the problem is the $\eta'(c)$ at the very end. There should be some way to absorb it, by showing that $f''(\eta(x))\, \eta'(x) = f''(m)$ for some suitable choice of $m$, at which point you can use induction on the strategy demonstrated here to cover the arbitrary case.