Proof that using Richardson extrapolation once yields Simpson's rule


Given that $\int_{a}^{b}f(x)\,dx \approx\frac{h}{2}\left[f(a) + f(b) \right]$ with $h = b-a$, it must be shown that $\int_{a}^{b}f(x) \, dx \approx \frac{h}{6}\left[f(a) + 4f\left(\frac{a+b}{2} \right ) + f(b) \right ]$ by applying Richardson extrapolation once, with step sizes $h$ and $\frac{h}{2}$. I attempted this problem using the formula $\mathcal{Q}(i, j+1) = \mathcal{Q}(i, j) + \frac{\mathcal{Q}(i, j) - \mathcal{Q}(i-1, j)}{2^{2(j+1)}-1}$, but it did not work out. How can I go further with this?


Use the sample points $a,\frac{a+b}2,b$ and $h=b-a$. The trapezoidal rule with step $h$ has coefficient sequence $\frac h2 [1,0,1]$ and error $Ch^2+O(h^4)$; with step $\frac h2$ it has coefficient sequence $\frac h4[1,2,1]$ and error $\frac14Ch^2+O(h^4)$.

To eliminate the leading error term $Ch^2$, subtract the first approximation from 4 times the second and divide by 3:
$$\frac{4\cdot\frac h4[1,2,1]-\frac h2[1,0,1]}{3}=\frac{h[1,2,1]-\frac h2[1,0,1]}{3}=\frac{\frac h2[1,4,1]}{3}=\frac h6[1,4,1],$$
which is exactly Simpson's rule.
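The identity can also be checked numerically. Below is a minimal sketch (the integrand $e^x$ on $[0,1]$ is an arbitrary choice for illustration) showing that one Richardson step applied to the trapezoidal values with steps $h$ and $\frac h2$ reproduces Simpson's rule to machine precision:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + k * h) for k in range(1, n)))

def simpson(f, a, b):
    """Simpson's rule on the single interval [a, b]."""
    return (b - a) / 6 * (f(a) + 4 * f((a + b) / 2) + f(b))

f, a, b = math.exp, 0.0, 1.0
T1 = trapezoid(f, a, b, 1)      # step h = b - a,  error C h^2 + O(h^4)
T2 = trapezoid(f, a, b, 2)      # step h/2,        error (C/4) h^2 + O(h^4)

# Richardson step: 4*(second) - (first), divided by 3, kills the C h^2 term.
richardson = (4 * T2 - T1) / 3

print(richardson - simpson(f, a, b))  # difference is at machine-epsilon level
```

Since the combination $\frac{4T_2 - T_1}{3}$ is algebraically identical to Simpson's rule, the printed difference is pure floating-point roundoff.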