Bound on variance of bounded random variable


For a bounded random variable $X \in [a,b]$, we know $\operatorname{Var}(X) \le (b-a)^2/4$ (Popoviciu's inequality on variances).

I am trying to give an alternate proof using symmetrization. If $Y$ is an independent copy of $X$, we can rewrite the variance as $\frac{1}{2} \mathbb{E}(X-Y)^2$, where the expectation is over $X$ and $Y$. Another formulation is $\mathbb{E}[(X-Y)^2\mathbf{1}[Y \ge X]]$. However, bounding $(X-Y)^2$ by $(b-a)^2$ gives the looser upper bound $(b-a)^2/2$. Is there a way to tighten this while still using this symmetrization idea, or can this approach not be used to get the optimal $(b-a)^2/4$ bound?
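As a quick numerical sanity check (not part of a proof), the identity $\operatorname{Var}(X) = \frac{1}{2}\mathbb{E}(X-Y)^2$ and the sharpness of $(b-a)^2/4$ can be illustrated with a two-point distribution on $\{a, b\}$, which attains the extremal variance; this is a Monte Carlo sketch under that choice of distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 0.0, 1.0
n = 1_000_000

# A fair two-point variable on {a, b} attains the extremal
# variance (b - a)^2 / 4.
x = rng.choice([a, b], size=n)
y = rng.choice([a, b], size=n)  # independent copy of X

var_direct = x.var()                    # plain sample variance
var_sym = 0.5 * np.mean((x - y) ** 2)   # symmetrized form (1/2) E (X - Y)^2

print(var_direct, var_sym)  # both approximately (b - a)^2 / 4 = 0.25
```

Both estimates agree up to Monte Carlo error, and both sit at the claimed extremal value $(b-a)^2/4$, while the crude bound $(X-Y)^2 \le (b-a)^2$ would only give $(b-a)^2/2$.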

Best Answer

I think that this approach is flawed. The reason is that while you are using an independent copy of $X$, the bound on $|X-Y|$ would be the same if $Y$ were an anti-correlated copy of $X$, in other words if $Y = -X$ (say with $X$ ranging over a symmetric interval $[-c, c]$, so that $Y$ has the same range).

In this case $X-Y = 2X$, and the variance of $2X$ is four times that of $X$, so $\frac{1}{2}\mathbb{E}(X-Y)^2$ would equal $2\operatorname{Var}(X)$ rather than $\operatorname{Var}(X)$. Unless you use, somewhere in the bound, the fact that $Y$ is independent of $X$, I don't see how you could proceed. But note that independence has no effect on the pointwise bounds for $X-Y$, $X+Y$, etc. Configurations where both $X$ and $Y$ are near the maximum, or where $X$ is near the maximum and $Y$ near the minimum, and so on, are all perfectly possible when $X$ and $Y$ are independent. Independence only affects the probability assigned to such configurations, not whether they can occur.
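The contrast between an independent copy and the anti-correlated choice $Y = -X$ can be checked numerically; this sketch uses a symmetric two-point variable on $\{-c, c\}$ (an assumption made so that $Y = -X$ has the same range as $X$):

```python
import numpy as np

rng = np.random.default_rng(1)
c = 1.0  # X ranges over [-c, c], i.e. a = -c, b = c
n = 1_000_000

# Symmetric two-point variable on {-c, c}: Var(X) = c^2 = (b - a)^2 / 4.
x = rng.choice([-c, c], size=n)

y_indep = rng.choice([-c, c], size=n)  # independent copy
y_anti = -x                            # anti-correlated "copy"

# Both choices satisfy the same pointwise bound |X - Y| <= b - a = 2c, yet:
half_sq_indep = 0.5 * np.mean((x - y_indep) ** 2)  # approximately Var(X) = c^2
half_sq_anti = 0.5 * np.mean((x - y_anti) ** 2)    # exactly 2 Var(X) = 2 c^2
```

Since $(X - (-X))^2 = 4X^2$ identically, the anti-correlated version gives $\frac{1}{2}\mathbb{E}(X-Y)^2 = 2\operatorname{Var}(X)$ exactly, even though the pointwise bound on $|X-Y|$ is the same in both cases; this is the sense in which the crude bound never uses independence.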