Why are there two formulas for variance of random variables?


I'm using an introductory statistics textbook and it mentioned this:

Definition: If $X$ is a random variable with mean $E(X) = \mu$, then the variance of $X$ is defined by $Var(X) = E((X−\mu)^2)$.

I thought the formula for the variance of $X$ was: $$Var(X) = \sum_{i=1}^n p_i \cdot (x_i - \mu)^2$$

How come it's different?


Best answer:

If $X$ is discrete with probability mass function $f$, then $E(X)=\sum_{i=1}^n x_i\cdot f(x_i)$, where $f(x_i)=p_i$ is the probability that $X=x_i$. Now let $Y=(X-\mu)^2$. Applying the same definition of expectation to $Y$ gives $E(Y)=\sum_{i=1}^n (x_i-\mu)^2\cdot f(x_i)$, which is exactly your sum. So the two formulas are the same: the definition $Var(X)=E((X-\mu)^2)$, written out for a discrete random variable, is the summation you quoted. Expanding the square also gives the common shortcut:

$$=\sum_{i=1}^n x_i^2\cdot f(x_i)-2\cdot \mu\cdot \underbrace{\sum_{i=1}^n x_i f(x_i)}_{=\mu}+\mu^2 \underbrace{\sum_{i=1}^n f(x_i)}_{=1}$$

$$=\sum_{i=1}^n x_i^2\cdot f(x_i)-\mu^2=Var(X)=E[(X-\mu)^2]$$
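A quick numerical sanity check can make this concrete. The sketch below uses a small made-up discrete distribution (the values and probabilities are arbitrary, chosen only for illustration) and evaluates the summation formula, the definition via $Y=(X-\mu)^2$, and the shortcut $E(X^2)-\mu^2$; all three should agree.

```python
# Hypothetical discrete distribution: values x_i with probabilities p_i
# (arbitrary example data; any valid pmf works the same way).
xs = [1, 2, 3, 4]
ps = [0.1, 0.2, 0.3, 0.4]

# mu = E(X) = sum of x_i * p_i
mu = sum(p * x for x, p in zip(xs, ps))

# Summation formula: Var(X) = sum of p_i * (x_i - mu)^2
var_sum = sum(p * (x - mu) ** 2 for x, p in zip(xs, ps))

# Definition: Var(X) = E(Y) where Y = (X - mu)^2,
# i.e. the expectation of the transformed values y_i = (x_i - mu)^2
ys = [(x - mu) ** 2 for x in xs]
var_def = sum(p * y for y, p in zip(ys, ps))

# Shortcut from the expansion above: Var(X) = E(X^2) - mu^2
var_short = sum(p * x ** 2 for x, p in zip(xs, ps)) - mu ** 2

print(mu, var_sum, var_def, var_short)
```

Up to floating-point rounding, all three variance values coincide, which is the point of the derivation: they are one formula written three ways.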