Wikipedia wrong? Convergence of finite difference


Update: I have edited the Wikipedia page, so that the mistake no longer appears.

On the Wikipedia article for "Finite difference" there is the claim

Assuming that $f$ is continuously differentiable, [we have] $$ \frac{f(x+h) - f(x)}{h} - f'(x) = O(h) \quad \text{as}\,\, h \to 0.\tag{1} $$ The central difference gives a more accurate approximation. [Supposing that $f$ is $C^2$] $$ \frac{f\left(x+\frac12 h \right)- f \left( x - \frac12 h \right)}{h} - f'(x) = O(h^2). \tag{2} $$

I think these are false. In the case of (1), consider $\phi(x) = \int_0^x \xi^{1/2} d\xi$. It is $C^1$ on $[0,\infty)$, but $$ \frac{\frac{\phi(0+h) - \phi(0)}{h} - \phi'(0)}{h} \quad \text{is unbounded as } h \to 0. $$
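A quick numerical sketch supports this. Here $\phi(x) = \int_0^x \xi^{1/2}\,d\xi = \tfrac{2}{3}x^{3/2}$ in closed form, and $\phi'(0) = 0$; if the forward-difference error were $O(h)$, the rescaled quantity error$/h$ would stay bounded, but it grows like $h^{-1/2}$:

```python
# Forward-difference error for phi(x) = (2/3) x^(3/2) at x = 0, where phi'(0) = 0.
# If the error were O(h), error/h would stay bounded as h -> 0.
def phi(x):
    return (2.0 / 3.0) * x ** 1.5

for h in [1e-2, 1e-4, 1e-6, 1e-8]:
    err = (phi(h) - phi(0.0)) / h - 0.0   # forward difference minus phi'(0)
    print(f"h = {h:.0e}   error/h = {err / h:.3e}")
```

The printed ratios grow by a factor of 10 each time $h$ shrinks by $100$, consistent with error $= \tfrac{2}{3}h^{1/2}$, i.e. $\Theta(h^{1/2})$ rather than $O(h)$.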

I believe that (2) is false for similar reasons, and I think a counterexample is $\psi(x) = \int_0^x \int_0^\xi \eta^{1/2}d\eta d\xi$, but I haven't worked it out.

Could someone tell me what the correct statement of these things is, or explain to me why I'm wrong and Wikipedia is right?

2 Answers

BEST ANSWER

You are right. In the first case, for

$$\frac{f(x+h)-f(x)}{h} - f'(x)$$

you only have an $o(1)$ bound. A function like $f(x) = x\cdot\lvert x\rvert^\alpha$ for $0 < \alpha < 1$ is continuously differentiable on all of $\mathbb{R}$, but at $0$ the difference quotient converges to the derivative only at the rate $\lvert h\rvert^{\alpha}$.
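This is easy to check numerically. A short Python sketch with the concrete choice $\alpha = 1/2$ (here $f'(0) = 0$, and the error works out to exactly $\lvert h\rvert^{\alpha}$):

```python
# Forward-difference error at 0 for f(x) = x * |x|**alpha with alpha = 1/2.
# f'(0) = 0, and the error equals |h|**alpha exactly -- not O(h).
alpha = 0.5

def f(x):
    return x * abs(x) ** alpha

for h in [1e-2, 1e-4, 1e-6]:
    err = (f(h) - f(0.0)) / h            # minus f'(0) = 0
    print(f"h = {h:.0e}   error = {err:.3e}   h**alpha = {h**alpha:.3e}")
```

The two printed columns agree, so the error decays like $h^{1/2}$, much slower than $O(h)$.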

In the second case, choosing $1 < \alpha < 2$ gives a twice continuously differentiable function with

$$\frac{f\left(x + \tfrac{h}{2}\right) - f\left(x - \tfrac{h}{2}\right)}{h} - f'(x) \in \Theta(h^{\alpha}).$$
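The same kind of numerical sketch, now with the concrete choice $\alpha = 3/2$, shows the central difference falling short of $O(h^2)$; for this $f$ the error at $0$ works out to exactly $\lvert h/2\rvert^{\alpha}$:

```python
# Central-difference error at 0 for f(x) = x * |x|**alpha with alpha = 3/2,
# which is twice continuously differentiable.  f'(0) = 0, and the error
# equals (h/2)**alpha, i.e. Theta(h**alpha) -- worse than O(h**2).
alpha = 1.5

def f(x):
    return x * abs(x) ** alpha

for h in [1e-1, 1e-2, 1e-3]:
    err = (f(h / 2) - f(-h / 2)) / h     # minus f'(0) = 0
    print(f"h = {h:.0e}   error = {err:.3e}   (h/2)**alpha = {(h / 2) ** alpha:.3e}")
```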

Assuming only differentiability (resp. twice differentiability), the order of convergence is

$$\frac{f(x+h)-f(x)}{h} - f'(x) \in o(1)$$

resp.

$$\frac{f\left(x + \tfrac{h}{2}\right) - f\left(x - \tfrac{h}{2}\right)}{h} - f'(x) \in o(h),$$

and nothing better can be had without stronger assumptions.

ANSWER

Always go to Taylor's theorem, which says that if $f(x)$ is differentiable, then

$f(x+h)=f(x)+f'(x)h+o(h)$.

Similarly if $f$ is $n$ times differentiable, you have:

$$f(x+h)=f(x)+f'(x)h+\cdots+\frac{f^{(n)}(x)}{n!}h^n+o(h^n).$$
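As a numerical illustration (a Python sketch, using $f = \sin$ at $x = 0$ with $n = 3$), the Taylor remainder indeed shrinks faster than $h^3$:

```python
import math

# Remainder of the degree-3 Taylor polynomial of sin at x = 0:
# sin(h) - (h - h**3/6) is o(h**3) (in fact O(h**5) for this smooth f).
for h in [0.5, 0.1, 0.02]:
    remainder = math.sin(h) - (h - h ** 3 / 6)
    print(f"h = {h:.0e}   remainder/h**3 = {remainder / h ** 3:.3e}")
```

The ratio remainder$/h^3$ itself tends to $0$ (roughly like $h^2/120$ here), which is exactly what $o(h^3)$ means.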

Going to the $n=1$ case, notice that even if $f$ is not twice differentiable, the theorem still applies: the error will be $o(h)$. On the other hand, the theorem fails at any point $x$ where $f$ is not differentiable. Your function is $\phi(x)=\int_0^x\xi^{1/2}\,\mathrm{d}\xi$. Integrals like this are differentiable by the fundamental theorem of calculus. But by dividing the error by $h$ again you are effectively asking for a second derivative, and look what happens: $\phi'(x)=x^{1/2}$,

$$\phi''(x)=\frac{1}{2x^{1/2}}.$$

$\phi''(0)$ doesn't exist: the second derivative blows up as $x \to 0^+$, which is exactly why your fraction blows up.