Using Monte Carlo to evaluate the integral of a bounded function on a bounded interval is straightforward.
I have the following integral that I want to evaluate using Monte Carlo:
$$I = \int_{0}^{\pi/2} \ln(\sin(x))\,dx$$ with $$f(x) = \ln(\sin(x)).$$ It is worth noting that $$\lim_{x\to 0^+} f(x) = -\infty,$$ so the integrand is singular at the left endpoint.
What theoretical foundations do I need to take into consideration, and how do I proceed with Monte Carlo in this case?
Whether you use Monte Carlo or some other numerical method to evaluate an integral with a singularity at the boundary, I'd suggest using a non-uniform distribution of the sample points at which $f(x)$ is evaluated.
Two steps:
i) Change of limits:
let $\displaystyle x = \frac{b-a}{2}\cdot t + \frac{b+a}{2}$, then $\displaystyle dx = \frac{b-a}{2}dt$, and $\displaystyle I = \frac{b-a}{2}\int_{-1}^{1} f\left (\frac{b-a}{2}t+\frac{b+a}{2}\right ) dt$.
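Step (i) alone already gives an unbiased estimator with uniform samples in $t$. A minimal sketch in NumPy for this integral (the closed form $-\frac{\pi}{2}\ln 2$ is a classical result, used here only as a check; variable names are mine):

```python
import numpy as np

rng = np.random.default_rng(0)           # fixed seed for reproducibility

a, b = 0.0, np.pi / 2                    # integration limits
f = lambda x: np.log(np.sin(x))          # integrand, singular at x = 0

n = 1_000_000
t = rng.uniform(-1.0, 1.0, n)            # uniform samples in t over [-1, 1]
x = (b - a) / 2 * t + (b + a) / 2        # change of limits: t -> x
# I = (b-a)/2 * integral over [-1, 1]; the interval length 2 gives (b-a) * mean
estimate = (b - a) * np.mean(f(x))

print(estimate)                          # should be close to -pi/2 * ln 2
```

This already works here because the singularity is integrable and is (almost surely) never hit, but the heavy tail of $\ln(\sin(x))$ near $x = 0$ inflates the variance; step (ii) addresses that.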
ii) Have non-uniform sample points:
let $\displaystyle t = \frac{3}{2}u - \frac{1}{2}u^3$, then $\displaystyle dt = \frac{3}{2}(1-u^2)\,du$, and
$\displaystyle I=\frac{3(b-a)}{4}\int_{-1}^{1} f\left (\frac{(b-a)}{4}u(3-u^2)+\frac{b+a}{2}\right )(1-u^2)du$
Now uniformly distributed samples in $u$ over $[-1, 1]$ cluster toward the borders $a$ and $b$ in $x$ (since $dx/du \to 0$ there), while the factor $(1-u^2)$ vanishes at the endpoints, so $f(x)$ is never evaluated exactly at $a$ or $b$ unless a rounding error slips in on the system/platform/calculator used.
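Putting both steps together for this integral, a sketch in NumPy (variable names are mine; the exact value $-\frac{\pi}{2}\ln 2$ appears only for comparison):

```python
import numpy as np

rng = np.random.default_rng(1)

a, b = 0.0, np.pi / 2
f = lambda x: np.log(np.sin(x))          # singular at x = a = 0

n = 1_000_000
u = rng.uniform(-1.0, 1.0, n)            # uniform samples in u over [-1, 1]
# Combined map u -> x; dx/du vanishes at u = +/-1, so x never reaches a or b
x = (b - a) / 4 * u * (3 - u**2) + (b + a) / 2
# I = 3(b-a)/4 * integral over [-1, 1]; interval length 2 gives factor 3(b-a)/2
estimate = 3 * (b - a) / 2 * np.mean(f(x) * (1 - u**2))

exact = -np.pi / 2 * np.log(2)           # known closed form, for comparison
print(estimate, exact)
```

The transformed integrand $f(x(u))\,(1-u^2)$ tends to $0$ at $u = \pm 1$, so it is bounded on $[-1, 1]$ and the Monte Carlo variance is noticeably smaller than with plain uniform sampling of $f$.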
This idea is copied from PPC ROM USR MNL, chapter "IG" (INTEGRATE), p. 220ff.