Variance bounded when values of random variable bounded?


At the risk of asking something dumb: I'm reading a paper dealing with random variables, say $X$, assumed only to take values in $[0,1]$, to have mean $\mu\in[0,1]$, and to have variance at least $\sigma^2>0$.

It is then mentioned that the variance of $X$ is at most $1$ "because its values are in $[0,1]$."

My question is: is this obvious? Should I see immediately that this is so?

Best answer:

Lulu's comment gives the easiest way to see that $X \in [0,1]$ implies $Var(X)\leq 1$. Here is a proof of the tighter bound $Var(X)\leq 1/4$.
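For reference, the crude bound follows in one line (assuming the comment's argument is the standard one; the comment itself is not quoted here):

```latex
% Since X and \mu both lie in [0,1], we have (X-\mu)^2 \le 1, hence
Var(X) = E\big[(X-\mu)^2\big] \le E[1] = 1.
```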

Claim: If $X \in [0,1]$ and $E[X]=m$, then $m \in [0,1]$ and $$0\leq Var(X)\leq m(1-m)$$ The upper bound is achieved by $X \sim Bernoulli(m)$.

Proof: Suppose $E[X]=m$ and $$ 0\leq X \leq 1$$ Taking expectations gives $$0\leq E[X] \leq 1$$ and so $m\in [0,1]$. Define the function $h:[0,1]\rightarrow \mathbb{R}$ by $$ h(x) = (x-m)^2 + (2m-1)(x-m)$$ The function $h$ is convex and satisfies $h(0)=h(1)=m(1-m)$; since a convex function on $[0,1]$ attains its maximum at an endpoint, $$h(x) \leq m(1-m) \quad \forall x \in [0,1]$$ Since $X \in [0,1]$ we have $$ h(X)\leq m(1-m) $$ Taking expectations of both sides gives $$E[h(X)]\leq m(1-m)$$ But $E[X-m]=0$, so $E[h(X)]=E[(X-m)^2]=Var(X)$. $\Box$
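If you want to convince yourself numerically, here is a small sanity check (a sketch, not part of the proof): it draws random discrete distributions supported on $[0,1]$, verifies $Var(X)\leq m(1-m)$ for each, and checks that $Bernoulli(m)$ attains the bound exactly.

```python
import random

def check_variance_bound(num_trials=1000, num_points=5):
    """Empirically verify Var(X) <= m(1-m) for random discrete
    distributions supported on [0,1]."""
    for _ in range(num_trials):
        # Random support points in [0,1] with random normalized weights.
        xs = [random.random() for _ in range(num_points)]
        ws = [random.random() for _ in range(num_points)]
        total = sum(ws)
        ps = [w / total for w in ws]
        m = sum(p * x for p, x in zip(ps, xs))
        var = sum(p * (x - m) ** 2 for p, x in zip(ps, xs))
        assert var <= m * (1 - m) + 1e-12  # tolerance for float error
    return True

check_variance_bound()

# Bernoulli(m) attains the bound: Var = m(1-m)^2 + (1-m)m^2 = m(1-m).
m = 0.3
bernoulli_var = m * (1 - m) ** 2 + (1 - m) * (0 - m) ** 2
assert abs(bernoulli_var - m * (1 - m)) < 1e-12
```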

Maximizing $m(1-m)$ over $m \in [0,1]$ shows the largest possible variance is $Var(X)=1/4$, attained when $m=1/2$.
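A quick grid search (illustrative only) confirms that $m(1-m)$ peaks at $1/4$ when $m=1/2$:

```python
# Maximize m(1-m) over a fine grid of m in [0,1].
best_m = max((i / 1000 for i in range(1001)), key=lambda m: m * (1 - m))
assert abs(best_m - 0.5) < 1e-9
assert abs(best_m * (1 - best_m) - 0.25) < 1e-9
```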