We are interested in estimating $J=\int_0^1 g(x)\ \mathsf dx$, where $0\leqslant g(x)\leqslant 1$ for all $x$. Let $X$ and $Y$ be independent random variables uniformly distributed over $(0,1)$. Let $U = \mathsf 1_{\{Y\leqslant g(X)\}}$, $V=g(X)$, and $W = \frac12(g(X)+g(1-X))$. Show that $\mathbb E[U]=\mathbb E[V] = \mathbb E[W]=J$ and that $\mathrm{Var}(W)\leqslant \mathrm{Var}(V)\leqslant \mathrm{Var}(U)$.
I see that \begin{align} \mathbb E[U] &= \mathbb P(Y\leqslant g(X)) = \int_0^1\int_0^{g(x)}\ \mathsf dy\ \mathsf dx = \int_0^1 g(x)\ \mathsf dx,\\ \mathbb E[V] &= \mathbb E[g(X)] = \int_0^1 g(x)\ \mathsf dx,\\ \mathbb E[W] &= \frac12\int_0^1 g(x)\ \mathsf dx +\frac12 \int_0^1 g(1-x)\ \mathsf dx = \int_0^1 g(x)\ \mathsf dx, \end{align} but I am not sure about the variances. Since $U^2=U$, we have $\mathbb E[U^2]=\mathbb E[U]$ and hence $$\mathrm{Var}(U) = \int_0^1 g(x)\ \mathsf dx - \left(\int_0^1 g(x)\ \mathsf dx\right)^2.$$ Since $V^2 = g(X)^2$, we have $\mathbb E[V^2] = \int_0^1 g(x)^2\ \mathsf dx$ and hence $$ \mathrm{Var}(V) = \int_0^1 g(x)\ \mathsf dx - \int_0^1 g(x)^2\ \mathsf dx = \int_0^1 g(x)(1-g(x))\ \mathsf dx. $$ For $W$ I get a rather unwieldy expression for $\mathbb E[W^2]$: \begin{align} \mathbb E[W^2] &= \frac14\mathbb E[(g(X)+g(1-X))^2]\\ &= \frac14\left(\int_0^1g(x)^2\ \mathsf dx + 2\int_0^1 g(x)g(1-x)\ \mathsf dx + \int_0^1 g(1-x)^2\ \mathsf dx \right). \end{align} I don't see a clean form for $\mathrm{Var}(W)$, nor how to compare the variances. Am I on the right track, or did I make an error somewhere?
A note on the definition of $W$, as pointed out by @kimchilover: the sign matters. With a minus sign, $W=\frac{1}{2}(g(X)-g(1-X))$ vanishes whenever $g$ is constant, so $E[W]=J$ would fail. The correct definition is $W=\frac{1}{2}(g(X)+g(1-X))$, as in the statement above.
Because the three variables have the same mean, comparing the variances is equivalent to comparing the second moments $E[U^2]$, $E[V^2]$, $E[W^2]$.
Because $0\le g(x) \le 1$ we have $g(x)^2 \le g(x)$, and since $U^2=U$, $$E[V^2] = \int_0^1 g^2(x)\, dx \le \int_0^1 g(x)\, dx = E[U] = E[U^2],$$ so $\mathrm{Var}(V)\le\mathrm{Var}(U)$.
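As a concrete sanity check of the claimed ordering (my own example, not part of the problem), take $g(x)=x$, so $J=\frac12$:
$$\mathrm{Var}(U) = J-J^2 = \tfrac12-\tfrac14=\tfrac14,\qquad \mathrm{Var}(V) = \int_0^1 x(1-x)\, dx = \tfrac12-\tfrac13=\tfrac16,$$
while $g(x)+g(1-x)=1$ identically, so $W=\tfrac12$ is constant and $\mathrm{Var}(W)=0$. Here the ordering $0\le\tfrac16\le\tfrac14$ is strict, and the antithetic estimator is exact.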
For the other inequality, note that in general, if $X$ and $Y$ are two random variables with $E[X^2]=E[Y^2]=m$, then the Cauchy–Schwarz inequality gives $E[XY]\le\sqrt{E[X^2]E[Y^2]}=m$, so
$$E[(X+Y)^2] = 2E[X^2] + 2 E[XY] \le 4 E[X^2] $$
Applying this to $g(X)$ and $g(1-X)$, which have the same second moment because $1-X$ is also uniform on $(0,1)$, we get
$$E[W^2]=\frac{1}{4}E[(g(X)+g(1-X))^2] \le E[g(X)^2] = E[V^2],$$ and therefore $\mathrm{Var}(W)\le \mathrm{Var}(V)$.
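To see the ordering numerically, here is a quick simulation sketch. The choice $g(x)=x^2$ (so $J=1/3$) and the sample size are my own, picked only for illustration; $x^2$ is not symmetric about $1/2$, so none of the three variances degenerates to zero.

```python
import random

# Compare the three estimators of J = ∫₀¹ g(x) dx for the test
# integrand g(x) = x² (an assumption for this demo), where J = 1/3.
def g(x):
    return x * x

random.seed(0)
n = 200_000
u_samples, v_samples, w_samples = [], [], []
for _ in range(n):
    x = random.random()
    y = random.random()
    u_samples.append(1.0 if y <= g(x) else 0.0)  # U: hit-or-miss
    v_samples.append(g(x))                       # V: crude Monte Carlo
    w_samples.append(0.5 * (g(x) + g(1 - x)))    # W: antithetic variates

def mean_var(s):
    # Sample mean and (unbiased) sample variance.
    m = sum(s) / len(s)
    v = sum((t - m) ** 2 for t in s) / (len(s) - 1)
    return m, v

(mu, vu) = mean_var(u_samples)
(mv, vv) = mean_var(v_samples)
(mw, vw) = mean_var(w_samples)
print(f"E[U] ~ {mu:.4f}, Var(U) ~ {vu:.4f}")  # theory: 1/3, 2/9  = 0.2222...
print(f"E[V] ~ {mv:.4f}, Var(V) ~ {vv:.4f}")  # theory: 1/3, 4/45 = 0.0888...
print(f"E[W] ~ {mw:.4f}, Var(W) ~ {vw:.4f}")  # theory: 1/3, 1/180 = 0.0055...
assert vw < vv < vu  # Var(W) <= Var(V) <= Var(U)
```

All three sample means agree with $J=1/3$, while the sample variances reproduce the ordering $\mathrm{Var}(W)\le \mathrm{Var}(V)\le \mathrm{Var}(U)$ proved above.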