When can we say that $E[X^2] - E[X]^2 \geq 0$ for a real-valued random variable $X$?


This is a related but separate question to: another question.

I'm trying to prove that the derived matrix $\Sigma$ is non-negative definite, and I think knowing the answer to the question in the title will help.

Background: some Lebesgue integration, but I'm not a pro at it.
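(As a numerical sanity check for the matrix question, not part of the original post: the title inequality is the $1\times 1$ case of the fact that a covariance matrix is positive semidefinite, since $a^T \Sigma a = \operatorname{Var}(a^T X) \ge 0$ for every vector $a$. A quick NumPy sketch, where the data and the name `Sigma` are made up for illustration:)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))     # 1000 samples of a 3-dimensional random vector
Sigma = np.cov(X, rowvar=False)    # sample covariance matrix

# a^T Sigma a = Var(a^T X) >= 0 for any a, so all eigenvalues are nonnegative
eigenvalues = np.linalg.eigvalsh(Sigma)
print(np.all(eigenvalues >= 0))
```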


Best answer:

We could observe that $\operatorname{Var}(X) = E(X^2) - E(X)^2$, and that the variance is always nonnegative. However, since it isn't that hard to show, it might be nice to see why that is.

Recall that $$ \operatorname{Var}(X) := E([X-E(X)]^2) = \int_{\Omega} (X(\omega)-E(X))^2 \,\mathrm{d}P(\omega),$$ where $\Omega$ is the sample space and $P$ is the probability measure associated to the random variable $X$. Since $P$ is a nonnegative measure and $(X(\omega) - E(X))^2 \ge 0$ for all $\omega\in \Omega$, it follows that $$ \operatorname{Var}(X) \ge 0. \tag{1}$$

We can then exploit the linearity of the integral (i.e. the linearity of expectation) to obtain \begin{align} E([X-E(X)]^2) &= E(X^2 - 2XE(X) + E(X)^2) \\ &= E(X^2) - 2E(X)^2 + E(X)^2 \\ &= E(X^2) - E(X)^2. \tag{2} \end{align}

Combining (1) and (2) gives the desired result, namely $$ 0 \le \operatorname{Var}(X) = E([X-E(X)]^2) = E(X^2) - E(X)^2. $$

In point of fact, we really only need the second computation, as $E([X-E(X)]^2)$ is the expectation of a nonnegative random variable, and a nonnegative random variable must have nonnegative expectation. However, the connection to the variance is worth emphasizing (I can't tell you how sad it makes me when I have to mark down exam questions because students compute negative values for the variance, then give me imaginary standard deviations).
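(A quick empirical illustration of the identity, not a proof: for any sample, the plug-in estimate of $E(X^2) - E(X)^2$ coincides with the sample variance and is nonnegative. A sketch using NumPy, with an arbitrary made-up distribution:)

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(size=100_000)   # samples from some distribution (choice is arbitrary)

mean_of_square = np.mean(x ** 2)    # empirical E[X^2]
square_of_mean = np.mean(x) ** 2    # empirical E[X]^2

# E[X^2] - E[X]^2 is the variance, hence nonnegative
print(mean_of_square - square_of_mean >= 0)
print(np.isclose(mean_of_square - square_of_mean, np.var(x)))
```

(`np.var` with its default `ddof=0` computes exactly `np.mean(x**2) - np.mean(x)**2`, so the two agree up to floating-point error.)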

Another answer:

The variance of a real random variable $X$ is given by $$V(X)=E(X^2)-(E(X))^2.$$ Since $V(X)\geq 0$ always, it follows that $E(X^2)-(E(X))^2 \geq 0$.

Another answer:

Define the constant $\mu = E[X]$ and random variable $\Delta = X - \mu$, so $X = \mu + \Delta$.

$E[\Delta] = E [X - \mu] = E[X] - \mu = \mu - \mu = 0$.

Recall that when $s$ is a constant, $E[A + B] = E[A] + E[B]$, $E[sA] = sE[A]$, and $E[s]=s$. Then $$\begin{align} E[X^2] - E[X]^2 & = E[(\mu + \Delta)^2] - E[\mu + \Delta]^2\\ & = E[\mu^2 + 2\mu\Delta + \Delta^2] - (E[\mu] + E[\Delta])^2\\ & = E[\mu^2] + E[2 \mu \Delta] + E[\Delta^2] - (E[\mu] + 0)^2\\ & = \mu^2 + 2 \mu E[\Delta] + E[\Delta^2] - \mu^2\\ &= 2 \mu \cdot 0 + E[\Delta^2]\\ & = E[\Delta^2] \ge 0. \end{align}$$
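(The centering step above can also be checked empirically: in any sample, subtracting the sample mean produces a variable whose mean is zero, and the plug-in estimate of $E[X^2] - E[X]^2$ equals that of $E[\Delta^2]$. A sketch using NumPy, with a made-up distribution:)

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-1, 3, size=50_000)

mu = x.mean()        # empirical E[X]
delta = x - mu       # empirical Delta = X - mu

# E[Delta] = 0, and E[X^2] - E[X]^2 = E[Delta^2] (up to floating-point error)
print(np.isclose(delta.mean(), 0.0))
print(np.isclose((x ** 2).mean() - mu ** 2, (delta ** 2).mean()))
```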