Comparing mean and variance of Monte Carlo estimators.


We are interested in estimating $J=\int_0^1 g(x)\ \mathsf dx$, where $0\leqslant g(x)\leqslant 1$ for all $x$. Let $X$ and $Y$ be independent random variables uniformly distributed over $(0,1)$. Let $U = \mathsf 1_{\{Y\leqslant g(X)\}}$, $V=g(X)$, and $W = \frac12(g(X)+g(1-X))$. Show that $\mathbb E[U]=\mathbb E[V] = \mathbb E[W]=J$ and that $\mathrm{Var}(W)\leqslant \mathrm{Var}(V)\leqslant \mathrm{Var}(U)$.

I see that \begin{align} \mathbb E[U] &= \mathbb P(Y\leqslant g(X)) = \int_0^1\int_0^{g(x)}\ \mathsf dy\ \mathsf dx = \int_0^1 g(x)\ \mathsf dx,\\ \mathbb E[V] &= \mathbb E[g(X)] = \int_0^1 g(x)\ \mathsf dx,\\ \mathbb E[W] &= \frac12\int_0^1 g(x)\ \mathsf dx +\frac12 \int_0^1 g(1-x)\ \mathsf dx = \int_0^1 g(x)\ \mathsf dx, \end{align} but I am not sure about the variances. Since $U^2=U$, we have $\mathbb E[U^2]=\mathbb E[U]$ and hence $$\mathrm{Var}(U) = \int_0^1 g(x)\ \mathsf dx - \left(\int_0^1 g(x)\ \mathsf dx\right)^2.$$ Since $V^2 = g(X)^2$, we have $\mathbb E[V^2] = \int_0^1 g(x)^2\ \mathsf dx$ and hence $$ \mathrm{Var}(V) = \int_0^1 g(x)^2\ \mathsf dx - \left(\int_0^1 g(x)\ \mathsf dx\right)^2. $$ For $W$ I get a really nasty expression for $\mathbb E[W^2]$: \begin{align} \mathbb E[W^2] &= \frac14\mathbb E[(g(X)+g(1-X))^2]\\ &= \frac14\left(\int_0^1g(x)^2\ \mathsf dx + 2\int_0^1 g(x)g(1-x)\ \mathsf dx + \int_0^1 g(1-x)^2\ \mathsf dx \right). \end{align} I don't see a clean form for $\mathrm{Var}(W)$ nor how to compare the variances. Am I on the right track, or did I make an error somewhere?
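As a numerical sanity check (my own sketch, not part of the original question), the three estimators can be simulated directly. The integrand `g(x) = x**2` is an arbitrary choice satisfying $0\leqslant g\leqslant 1$ on $(0,1)$, with $J=1/3$:

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    return x**2  # example integrand with 0 <= g <= 1 on (0,1); J = 1/3

n = 100_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)

U = (y <= g(x)).astype(float)   # hit-or-miss estimator
V = g(x)                        # crude Monte Carlo estimator
W = 0.5 * (g(x) + g(1 - x))     # antithetic-variates estimator

# All three sample means should be close to J = 1/3, and the
# sample variances should come out ordered Var(W) < Var(V) < Var(U).
for name, Z in [("U", U), ("V", V), ("W", W)]:
    print(name, Z.mean(), Z.var())
```

For this $g$ the exact variances are $\mathrm{Var}(U)=2/9$, $\mathrm{Var}(V)=4/45$, and $\mathrm{Var}(W)=1/180$, so the ordering is clearly visible in the simulation output.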


Accepted answer:

Your original expression for $W$ must be wrong, as noted by @kimchilover: if $g(x)$ is constant, then that $W=0$, so it cannot happen that $E[W]=J$. You surely meant $W=\frac{1}{2}(g(X)+g(1-X))$.

Because the three variables have the same mean, comparing the variances is equivalent to comparing the second moments $E[\,\cdot^2]$.

Because $0\le g(x) \le 1$ we have $g(x)^2 \le g(x)$ and $$E[V^2] = \int_0^1 g^2(x) dx \le \int_0^1 g(x) dx =E[U^2]$$

For the other inequality, note that in general, if $X$ and $Y$ are two random variables with $E[X^2]=E[Y^2]=m$, the Cauchy–Schwarz inequality gives $E[XY]\le (E[X^2]E[Y^2])^{1/2}=m$, and hence

$$E[(X+Y)^2] = 2E[X^2] + 2 E[XY] \le 4 E[X^2]. $$

Using this for $g(X)$ and $g(1-X)$, which have equal second moments since $1-X$ is also uniform on $(0,1)$, we get

$$E[W^2]=\frac{1}{4}E[(g(X)+g(1-X))^2] \le E[g(X)^2] = E[V^2]$$
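As a concrete check of this chain of inequalities (my own example, not part of the answer), take $g(x)=x^2$, so $J=1/3$. The exact second moments can be computed by hand and the ordering $E[W^2]\le E[V^2]\le E[U^2]$ verified with exact rational arithmetic:

```python
from fractions import Fraction

# Exact second moments for the example g(x) = x^2 (so J = 1/3):
#   E[U^2] = E[U] = \int_0^1 g(x) dx = 1/3   (U is an indicator)
#   E[V^2] = \int_0^1 x^4 dx = 1/5
#   \int_0^1 g(x) g(1-x) dx = \int_0^1 x^2 (1-x)^2 dx = 1/30
EU2 = Fraction(1, 3)
EV2 = Fraction(1, 5)
cross = Fraction(1, 30)
EW2 = Fraction(1, 4) * (EV2 + 2 * cross + EV2)  # = 7/60

print(EU2, EV2, EW2)
assert EW2 <= EV2 <= EU2  # the claimed ordering of second moments
```

Subtracting $J^2=1/9$ from each second moment recovers the variance ordering $\mathrm{Var}(W)\le\mathrm{Var}(V)\le\mathrm{Var}(U)$.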

Another answer:

Assume the corrected form $W=\frac{1}{2}(g(X)+g(1-X))$.

Since, as you noted, the means are all equal, it is sufficient to show $E(U^2)\ge E(V^2)\ge E(W^2)$.

We have $E(U^2)=\int_0^1 g(x)\,dx$ and $E(V^2)=\int_0^1 g^2(x)\,dx$.
Because $0\le g(x)\le 1$, we have $g^2(x)\le g(x)$, and thus $E(V^2)\le E(U^2)$.

Next, $E(W^2)=\frac{1}{4}\left(\int_0^1 g^2(x)\,dx+2\int_0^1 g(x)g(1-x)\,dx+\int_0^1 g^2(1-x)\,dx\right)$, where $\int_0^1 g^2(1-x)\,dx=\int_0^1 g^2(x)\,dx$ by the substitution $x\mapsto 1-x$.

By the Cauchy–Schwarz inequality, $\int_0^1 g(x)g(1-x)\,dx\le \left(\int_0^1 g^2(x)\,dx\int_0^1 g^2(1-x)\,dx\right)^{\frac{1}{2}}=\int_0^1 g^2(x)\,dx$.

Therefore $E(W^2)\le \frac{1}{4}\cdot 4\int_0^1 g^2(x)\,dx=\int_0^1 g^2(x)\,dx=E(V^2)$.