How can I estimate error when computing improper integral with a finite interval?


I'm trying to evaluate integrals of the form $$I = \int ^{\infty }_{a} f( x)\, dx,$$ where $a$ is any real number. To estimate such an integral, I pick some large positive number $M$ and instead evaluate $$I_{\text{approx}} = \int ^{M}_{a} f( x)\, dx,$$ using Simpson's Rule. So, given the cutoff $M$ and the step size used for Simpson's Rule, $\Delta x$, how do I estimate $\displaystyle |I - I_{\text{approx}} |$?
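For concreteness, here is a minimal sketch of the setup I have in mind, using a hypothetical test integrand $f(x) = e^{-x}$ with $a = 0$ (so the exact value is $I = 1$); the function names and parameter values are my own choices, not part of any library:

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule on [a, b] with n subintervals (n must be even)."""
    if n % 2 != 0:
        raise ValueError("n must be even")
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        # Interior nodes alternate weights 4, 2, 4, 2, ...
        total += (4 if i % 2 == 1 else 2) * f(a + i * h)
    return total * h / 3

# Hypothetical example: f(x) = exp(-x), a = 0, exact integral over [0, inf) is 1.
f = lambda x: math.exp(-x)
a, M, n = 0.0, 30.0, 600          # cutoff M and n subintervals, so dx = (M - a)/n
I_approx = simpson(f, a, M, n)

# The total error |I - I_approx| splits into two pieces:
#   (1) the truncated tail, integral of f from M to infinity (here e^{-30}), and
#   (2) the Simpson discretization error on [a, M].
print(abs(1.0 - I_approx))
```

In this sketch the two error sources are separable: the tail is bounded analytically for this particular $f$, while the quadrature error on the finite interval follows the usual Simpson bound involving $f^{(4)}$.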