How to use independence to simplify $E\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}^2$


I don't know how to get the second line from the first line in the following derivation: [image of the derivation from the original post]

In the above, $Y=(Y_1, \dots , Y_n)^T$ is a random sample from $N(\mu,\sigma^2)$.

My trouble is in simplifying $ E\left(\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}^2\right)$. What I've tried:

$$ \begin{align} E\left(\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}^2\right) & = E\left(\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}\right) E\left(\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}\right) , \text{ by independence} \\ & = \left(\sum\limits_{i=1}^n E \left( Y_i - \mu \right) \right)^2 \text{ by linearity of expectations} \end{align}$$

I don't see how I can get the variance from here onwards. Did I do something wrong?

2 Answers

Best Answer

For simplicity, write $X_i=Y_i-\mu$, so that $\mathbb{E}[X_i]=0$. Then, $$\begin{align} \mathbb{E}\left[\left(\sum_{i=1}^n X_i\right)^2\right] &= \mathbb{E}\left[\left(\sum_{i=1}^n X_i\right)\left(\sum_{j=1}^n X_j\right)\right] = \mathbb{E}\left[\sum_{i=1}^n \sum_{j=1}^n X_i X_j\right] \\ &= \sum_{1\leq i,j \leq n} \mathbb{E}[X_i X_j] \tag{linearity}\\ &= \sum_{i=1}^n \mathbb{E}[X_i^2] + \sum_{1\leq i\neq j \leq n} \mathbb{E}[X_i]\mathbb{E}[X_j] \qquad\text{(independence when $i\neq j$)} \\ &= \sum_{i=1}^n \mathbb{E}[X_i^2] + 0 = \sum_{i=1}^n \mathrm{Var}(Y_i) = n\,\mathrm{Var}(Y_1) = n\sigma^2, \end{align}$$ since each $Y_i \sim N(\mu,\sigma^2)$ has variance $\sigma^2$.
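
If a quick numerical sanity check helps, here is a minimal Monte Carlo sketch (using NumPy; the parameter values and variable names are arbitrary choices of mine, not from the question) that estimates $\mathbb{E}\left[\left\{\sum_{i=1}^n (Y_i-\mu)\right\}^2\right]$ by simulation and compares it with $n\sigma^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma, n = 2.0, 1.5, 10   # arbitrary parameters for the sketch
n_reps = 200_000              # number of simulated samples

# Draw n_reps independent samples Y = (Y_1, ..., Y_n), each Y_i ~ N(mu, sigma^2)
Y = rng.normal(mu, sigma, size=(n_reps, n))

# S = sum_i (Y_i - mu) for each simulated sample
S = (Y - mu).sum(axis=1)

print("Monte Carlo estimate of E[S^2]:", np.mean(S**2))
print("Theoretical value n*sigma^2:   ", n * sigma**2)
```

With $n=10$ and $\sigma=1.5$, both numbers should come out near $22.5$, up to Monte Carlo error.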

Answer

Multiplying out $$ E\left[\left\{\sum\limits_{i=1}^n (Y_i-\mu ) \right\}^2\right] $$ you get terms like $$ E\left[(Y_i-\mu )^2\right] $$ as shown, but also cross terms like $$ 2E\left[(Y_i-\mu )(Y_j-\mu )\right] $$ for $i<j$. But independence gives $$ E\left[(Y_i-\mu )(Y_j-\mu )\right] = E\left[(Y_i-\mu )\right] E\left[(Y_j-\mu )\right], $$ and $\mu$ is the mean of each $Y_i$, so both factors, and hence all of the cross terms, are zero.
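
For a similar rough check of the cross terms (again a sketch assuming NumPy, with arbitrary parameter values of my own), one can estimate $E\left[(Y_i-\mu )(Y_j-\mu )\right]$ for a fixed pair $i\neq j$ and compare it with the diagonal term $E\left[(Y_i-\mu )^2\right]$:

```python
import numpy as np

rng = np.random.default_rng(1)

mu, sigma, n = 2.0, 1.5, 10   # arbitrary parameters for the sketch
n_reps = 200_000

# Independent samples Y_1, ..., Y_n with Y_i ~ N(mu, sigma^2)
Y = rng.normal(mu, sigma, size=(n_reps, n))

# Cross term for a fixed pair i != j, e.g. i = 0, j = 1
cross = (Y[:, 0] - mu) * (Y[:, 1] - mu)
print("Estimate of E[(Y_i - mu)(Y_j - mu)], i != j:", np.mean(cross))

# Diagonal term, which survives and equals sigma^2
diag = (Y[:, 0] - mu) ** 2
print("Estimate of E[(Y_i - mu)^2]:               ", np.mean(diag))
```

The cross-term estimate should be close to $0$, while the diagonal estimate should be close to $\sigma^2 = 2.25$.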