How to calculate the error for nested numerical methods


I want to calculate an integral $\int_a^b q(p)\,dp$, where $q(p)$ is the quantile function of a probability distribution. The quantile values are approximations with a certain error $\epsilon_q$, and the integral is approximated with Newton-Cotes formulas that have an error of $\epsilon_i$.

What is the total error?

BEST ANSWER

In normal cases, when the errors are small, we assume they add linearly. Your quantile error contributes at most $\int_a^b \epsilon_q\,dp$, which you can bound by $(b-a)\epsilon_q$. You might be able to argue that $\epsilon_q$ is sometimes positive and sometimes negative, in which case the integrated error will be rather less. If you use $n$ intervals, the first expression would be $n\epsilon_q$, and you can sometimes argue for $\sqrt{n}\,\epsilon_q$ instead by treating the errors as a random walk. Finally, add in $\epsilon_i$.
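As a sanity check, here is a small Python sketch of the linear bound $(b-a)\epsilon_q + \epsilon_i$. It assumes (hypothetically, not from the question) that the true quantile is that of the Exp(1) distribution, $q(p) = -\ln(1-p)$, whose integral is known in closed form, and that the quantile error is a bounded perturbation of magnitude $\epsilon_q$:

```python
import math

def q_exact(p):
    # Exp(1) quantile, known in closed form (stands in for the true q)
    return -math.log1p(-p)

def q_noisy(p, eps_q=1e-6):
    # hypothetical approximate quantile: a perturbation with |error| <= eps_q
    return q_exact(p) + eps_q * math.sin(1000 * p)

def simpson(f, a, b, n):
    # composite Simpson rule (a Newton-Cotes formula); n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

a, b, eps_q = 0.1, 0.9, 1e-6
# antiderivative of -ln(1-p) is (1-p)ln(1-p) - (1-p)
F = lambda p: (1 - p) * math.log(1 - p) - (1 - p)
exact = F(b) - F(a)

approx_clean = simpson(q_exact, a, b, 200)
approx_noisy = simpson(lambda p: q_noisy(p, eps_q), a, b, 200)

eps_i = abs(approx_clean - exact)    # pure quadrature error
total = abs(approx_noisy - exact)    # combined error
bound = (b - a) * eps_q + eps_i      # the linear bound from the answer

print(f"quadrature error eps_i = {eps_i:.3e}")
print(f"total error            = {total:.3e}")
print(f"linear bound           = {bound:.3e}")
```

Since Simpson's weights are nonnegative and sum to $b-a$, the quadrature of the perturbation alone is bounded by $(b-a)\epsilon_q$, so the combined error stays within the linear bound.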

This bound is not guaranteed, however. It is possible for the error in one quantity to take you close to a singularity in another and make the overall error much larger.
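The singularity caveat can be illustrated with the same hypothetical Exp(1) quantile: $q(p) = -\ln(1-p)$ diverges as $p \to 1$, so a tiny error in the argument is amplified by the slope $q'(p) = 1/(1-p)$, which is huge near the endpoint.

```python
import math

def q(p):
    # Exp(1) quantile; diverges as p -> 1
    return -math.log1p(-p)

p, eps = 0.999999, 1e-7   # tiny error in the argument, near the singularity
amplified = abs(q(p + eps) - q(p))
print(amplified)  # orders of magnitude larger than eps
```

Here the slope $1/(1-p)$ is about $10^6$, so the $10^{-7}$ input error produces an output error around $0.1$, swamping any linear bound built from small $\epsilon_q$.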