Does a Zero Variance imply a constant random variable?


I managed to show that if a random variable is constant, i.e. $P(X = \mu) = 1$, then the variance is zero:

$E[X^2] = \sum_{i = 1}^{k} p_i X_i^2 = \mu^2\sum_{i=1}^k p_i = \mu^2$ (since every outcome $X_i$ equals $\mu$)

$E[X]^2 = \left(\sum_{i=1}^{k}p_i X_i\right)^2 = \left(\mu\sum_{i=1}^{k}p_i\right)^2 = \mu^2$

The result then follows directly from the formula for the variance: $\operatorname{Var}(X) = E[X^2] - E[X]^2 = \mu^2 - \mu^2 = 0$.
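The discrete computation above can be checked numerically. This is a minimal sketch (the outcome value and probabilities are arbitrary choices for illustration): a random variable whose every outcome equals $\mu$ has $E[X^2] - E[X]^2 = 0$.

```python
# Numerical check: for a discrete random variable that is constant
# (every outcome equals mu), the variance E[X^2] - E[X]^2 is zero.
mu = 3.0
outcomes = [mu, mu, mu]   # every outcome X_i equals mu
probs = [0.2, 0.5, 0.3]   # probabilities p_i, summing to 1

e_x2 = sum(p * x**2 for p, x in zip(probs, outcomes))  # E[X^2] = mu^2
e_x = sum(p * x for p, x in zip(probs, outcomes))      # E[X]   = mu
variance = e_x2 - e_x**2
print(variance)  # → 0.0
```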

However is the converse true? Does a zero variance necessarily imply a constant random variable?


On BEST ANSWER

A non-negative random variable with a zero expected value is almost surely equal to $0$. This follows from the properties of the integral: the integral of a non-negative function is zero only if the function is zero almost everywhere.

If you apply this to the random variable $(X-E(X))^2$, which is non-negative, then $V(X)=E\left[(X-E(X))^2\right]=0$ implies that $(X-E(X))^2=0$ a.s., which means that $X=E(X)$ a.s.
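The same idea is easy to see in the discrete case: the variance is a sum of non-negative terms $p_i (x_i - E[X])^2$, so it vanishes only if every outcome with positive probability equals $E[X]$. A small sketch (the distributions below are made-up examples):

```python
# In the discrete case, Var(X) = sum of p_i * (x_i - E[X])^2 is a sum of
# non-negative terms, so it is zero exactly when every outcome with
# positive probability equals E[X], i.e. X is constant a.s.
def variance(outcomes, probs):
    mean = sum(p * x for p, x in zip(probs, outcomes))
    return sum(p * (x - mean) ** 2 for p, x in zip(probs, outcomes))

print(variance([5.0, 5.0], [0.4, 0.6]))  # constant a.s. → 0.0
print(variance([0.0, 1.0], [0.5, 0.5]))  # non-constant  → 0.25
```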