Expected value and variance of a set of random variables


Suppose $X_1, X_2, \ldots , X_n$ are $n$ independent random variables, each with the same probability distribution, with mean $\mu$ and variance $\sigma^2$. Let $$ \bar{X}=\frac{X_1+X_2+\cdots+X_n}{n} $$ I know the expected value will be $\mu$ and the variance will be $\frac{\sigma^2}{n}$, but I'm not sure how to prove it. Thank you in advance :)

Edit: I'm sorry. I'm aware of how to expand $E(\bar{X})$ and $\operatorname{Var}(\bar{X})$ using the formulae. The part that is tripping me up the most is the last step, i.e. why $\frac{1}{n}(E(X_1) + E(X_2) + \cdots + E(X_n))$ simplifies to $\mu$, and similarly for variance.
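Not part of the original question, but a quick Monte Carlo sanity check of both claims is easy to run with NumPy (the variable names and the choice of a normal distribution are mine; any distribution with mean $\mu$ and variance $\sigma^2$ would do):

```python
import numpy as np

# Draw many independent samples of size n from a distribution with known
# mean mu and variance sigma^2, compute the sample mean X-bar of each,
# and compare the empirical mean/variance of X-bar with mu and sigma^2/n.
rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 10
trials = 200_000

samples = rng.normal(mu, sigma, size=(trials, n))
xbar = samples.mean(axis=1)   # one realisation of X-bar per trial

print(xbar.mean())   # ~ mu = 5
print(xbar.var())    # ~ sigma^2 / n = 0.4
```

This doesn't prove anything, of course, but it makes the $\frac{\sigma^2}{n}$ shrinkage of the variance visible before working through the algebra below.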


There are 3 solutions below.

Answer 1:

\begin{align} & \operatorname{var}\left( \frac{X_1+\cdots+X_n} n \right) \\[8pt] = {} & \frac 1 {n^2} \operatorname{var}(X_1+\cdots +X_n) \\[8pt] = {} & \frac 1 {n^2} \left( \operatorname{var}(X_1)+\cdots+\operatorname{var}(X_n) \right) \\[8pt] & \text{and so on.} \\[10pt] \operatorname E\left( \overline X \right) = {} & \operatorname E\left( \frac{X_1+\cdots+X_n} n \right) \\[8pt] = {} & \frac 1 n \left( \operatorname E(X_1+\cdots+X_n) \right) \\[10pt] = {} & \frac 1 n \left( \operatorname E(X_1) + \cdots + \operatorname E(X_n) \right). \end{align}

Answer 2:

\begin{align*} \mathbb E(\bar X)&= \mathbb E\left(\frac{\sum _{i=1}^nX_i}{n}\right)=\frac1n\mathbb E\left(\sum_{i=1}^nX_i\right)=\frac1n\sum_{i=1}^n\mathbb E(X_i)=\frac1n\cdot n\mathbb E(X_i)=\mathbb E(X_i)=\mu \\ \text{Var}(\bar X)&= \text{Var}\left(\frac{\sum_{i=1}^nX_i}{n}\right)=\frac{1}{n^2}\text{Var}\left(\sum_{i=1}^nX_i\right)=\frac1{n^2}\sum_{i=1}^n\text{Var}(X_i)=\frac1{n^2}\cdot n\sigma ^2=\frac{\sigma^2}{n} \end{align*}

Edit: for any random variables $A$ and $B$, $$\mathbb E(A+B)=\mathbb E(A)+\mathbb E(B)$$ If $A$ and $B$ are independent, $$\text{Var}(A+B)=\text{Var}(A)+\text{Var}(B)$$
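These two rules can also be checked numerically (my own sketch, not from the thread): sample two independent random variables and compare both sides of each identity. Linearity of the mean holds for any $A$ and $B$; the variance identity relies on independence.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
A = rng.exponential(2.0, size=N)      # mean 2, variance 4
B = rng.uniform(0.0, 3.0, size=N)     # independent of A

# E(A+B) = E(A) + E(B): exact for sample means (linearity of the average).
print(np.mean(A + B), np.mean(A) + np.mean(B))

# Var(A+B) = Var(A) + Var(B): holds up to sampling noise, because the
# sample covariance of independent draws is close to (but not exactly) 0.
print(np.var(A + B), np.var(A) + np.var(B))
```

If $A$ and $B$ were dependent (say $B = -A$), the second pair of numbers would disagree, which is why the independence hypothesis matters.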

Answer 3:

For any constant $c$, $E(cX)=cE(X)$. (An explanation at the end.)

So then $$\begin{align} E(\bar{X})=E\left(\frac{X_1+\cdots+X_n}{n}\right) &=E\left(\frac{X_1}{n}+\cdots+\frac{X_n}{n}\right)\\ &=E\left(\frac{X_1}{n}\right)+\cdots+E\left(\frac{X_n}{n}\right)\\ &=\frac{1}{n}E(X_1)+\cdots+\frac{1}{n}E\left(X_n\right)&\text{using }c=\frac1n\\ &=\frac{1}{n}\mu+\cdots+\frac{1}{n}\mu\\ &=\frac{n}{n}\mu=\mu\\ \end{align}$$

And with variance, it's similar, but the constant relation is $\operatorname{Var}(cX)=c^2\operatorname{Var}(X)$. That squaring of $c$ puts $\frac{1}{n^2}$ as the coefficient in the penultimate line, leaving a coefficient $\frac{n}{n^2}=\frac{1}{n}$ on the $\sigma^2$.
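The scaling rule $\operatorname{Var}(cX)=c^2\operatorname{Var}(X)$ is easy to see empirically too (a small check of my own, using $c=\frac14$ as an arbitrary constant):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(0.0, 3.0, size=100_000)
c = 1 / 4

# Scaling every draw by c scales the sample variance by exactly c^2,
# since (cx - c*xbar)^2 = c^2 * (x - xbar)^2 term by term.
print(np.var(c * X), c**2 * np.var(X))
```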


Why does $E(cX)=cE(X)$? Well, what is the definition of $E$? It should be something like: $$\sum_x x\cdot P(X=x)$$ or an integral version of that: $$\int_{\mathbb{R}} x\cdot p(x)\,dx$$ So $E(cX)$ is $$\sum_x cx\cdot P(X=x)\quad\text{or}\quad\int_{\mathbb{R}} cx\cdot p(x)\,dx$$ and the $c$ can be factored out of the sum or of the integral: $$c\sum_x x\cdot P(X=x)\quad\text{or}\quad c\int_{\mathbb{R}} x\cdot p(x)\,dx$$ which is $$c\cdot E(X)$$
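For the discrete sum version, the factoring-out step can even be verified exactly rather than by simulation. Here is a sketch with a made-up three-point distribution (the pmf and the constant $c=7$ are arbitrary choices of mine), using exact rational arithmetic so there is no floating-point fuzz:

```python
from fractions import Fraction

# A small discrete distribution: P(X=x) for x in {1, 2, 3}.
pmf = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}
c = Fraction(7)

E_X = sum(x * p for x, p in pmf.items())        # definition of E(X)
E_cX = sum(c * x * p for x, p in pmf.items())   # definition applied to cX

print(E_X)             # 5/3
print(E_cX == c * E_X) # True: c factors out of the sum
```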