Error of a quadrature applied to an approximation of a function with its own error


This has come up in a math modeling project I am doing. To boil it down, suppose I am using a numerical quadrature such that

$ \int_0^1 f(x) dx = \sum_{i=1}^n f(x_i) + O(h^n)$

where $O(h^n)$ is the error associated with the quadrature. Additionally, suppose that the function being integrated is itself rather nasty to evaluate, hence we replace it with a numerical estimate $g(x)$ that has its own error, e.g.

$f(x) = g(x) + err(x)$

Then I would like to consider the error of using the quadrature on $g(x)$. Naively,

\begin{align*} \int_0^1 f(x)\, dx &= \int_0^1 \left( g(x) + err(x) \right) dx \\ &= \int_0^1 g(x)\, dx + \int_0^1 err(x)\, dx \\ &= \sum_{i=1}^n g(x_i) + O(h^n) + \int_0^1 err(x)\, dx \end{align*}

so then it seems like my end estimate is

$\sum_{i=1}^n g(x_i) = \int_0^1 f(x)\, dx - \left( O(h^n) + \int_0^1 err(x)\, dx \right)$

and the error is $O(h^n) + \int_0^1 err(x)\, dx$. Does something like this seem reasonable? Can anyone point me toward references that consider such things? Unfortunately, my actual problem involves an infinite volume integral to boot.
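As a sanity check on the decomposition above, here is a minimal numerical sketch. All concrete choices are illustrative, not from the original problem: a composite trapezoidal rule (error $O(h^2)$) stands in for the quadrature, $f(x) = e^x$ for the nasty integrand, and a synthetic evaluation error $err(x) = \varepsilon x^2$ for the approximation error.

```python
import math

def trapezoid(func, a, b, n):
    """Composite trapezoidal rule with n subintervals; error is O(h^2)."""
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b))
    for i in range(1, n):
        total += func(a + i * h)
    return h * total

# Illustrative setup: f(x) = e^x on [0, 1], exact integral e - 1.
f = math.exp
eps = 1e-4
err = lambda x: eps * x**2      # synthetic evaluation error, |err(x)| <= eps
g = lambda x: f(x) - err(x)     # so that f(x) = g(x) + err(x)

exact = math.e - 1
n = 100
quad_on_g = trapezoid(g, 0, 1, n)

# Decomposition: exact - Q(g) = (integral of g - Q(g)) + integral of err,
# so the total error is bounded by |quadrature error| + |integral of err|.
quad_err_bound = (math.e / 12) * (1 / n) ** 2   # (b-a) * h^2 * max|g''| / 12
int_err_bound = eps                             # |integral of err| <= max|err|
total_error = abs(exact - quad_on_g)
print(total_error, quad_err_bound + int_err_bound)
```

In this run the observed total error sits comfortably below the sum of the two bounds, matching the naive decomposition in the question.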


1 Answer


Yes, this is correct. The usual technique is to bound the error by the sum of the absolute values of your terms via the triangle inequality and then estimate the size of each term; usually one will dominate the other. You might well be able to bound $err(x)$ by some constant, in which case the integral from $0$ to $1$ is also bounded by that constant. If it is small compared to the $O(h^n)$ term, you can ignore it. To make this comparison rigorous you need the constant in front of $h^n$.
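The comparison described above can be sketched numerically. The constants here are illustrative assumptions, not from the question: a composite trapezoidal rule with error bound $C h^2$ where $C = \max|f''|/12$ (taking $f(x) = e^x$ on $[0,1]$, so $C = e/12$), and an assumed uniform bound $M = 10^{-4}$ on $|err(x)|$, which also bounds $\left|\int_0^1 err(x)\, dx\right|$.

```python
import math

# Illustrative constants: C*h^2 bounds the trapezoid quadrature error for
# f(x) = e^x on [0,1]; M is an assumed uniform bound on |err(x)|, which
# also bounds |integral of err over [0,1]|.
C = math.e / 12
M = 1e-4

for n in (10, 100, 1000):
    h = 1.0 / n
    quad_bound = C * h**2
    dominant = "quadrature error" if quad_bound > M else "evaluation error"
    print(f"n={n}: C*h^2 = {quad_bound:.2e}, M = {M:.0e} -> {dominant} dominates")
```

Once $C h^2$ drops below $M$, refining the quadrature grid buys nothing further: the evaluation-error floor takes over, which is exactly why the comparison needs the explicit constant in front of $h^n$.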