Variance of a Sum of Independent Random Variables


I understand that the expectation of a sum of random variables is the sum of the individual expectations,

$$E(X+Y) = E(X) + E(Y),$$

and that for independent random variables the variances also add:

$$\operatorname{Var}(X+Y) = \operatorname{Var}(X) + \operatorname{Var}(Y).$$

In this answer: https://math.stackexchange.com/a/606239/369172/

it is mentioned that the expectation of the cross terms equals zero, i.e.

$$E(X - E(X)) = 0.$$

Can anyone give an intuitive explanation for this?

Thank you!


There are 4 answers below.

**Best answer:**

An intuitive explanation: if you have a set of values $\{x_1,\dots,x_n\}$ and from each value you subtract the average $\bar{x}=\frac{x_1+\cdots+x_n}{n}$, then the average of the resulting set $\{x_1-\bar{x},\dots,x_n-\bar{x}\}$ is $0$. Expectation is nothing other than a weighted average.
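As a quick numerical illustration (my own sketch with arbitrary values, not from the answer), centering a set at its average makes the new average zero:

```python
# Hypothetical values; subtracting the average from each one
# makes the average of the centered set exactly zero.
values = [2.0, 5.0, 7.0, 10.0]
mean = sum(values) / len(values)       # the (equal-weight) average, here 6.0
centered = [x - mean for x in values]  # the set {x_i - x̄}
print(sum(centered) / len(centered))   # → 0.0
```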

**Answer:**

Consider a constant $c$, then $E[c]=c$, and $E[X-c]=E[X]-E[c]=E[X]-c$.

If $E[X]$ is finite, then it is a constant, so one may substitute $c=E[X]$ to get $E[X-E[X]]=E[X]-E[X]=0$.

**Answer:**

For $$\mathsf{E}(X+Y)=\mathsf{E}(X)+\mathsf{E}(Y)$$ to hold, independence is not required: it holds for arbitrary random variables $X$ and $Y$, by the linearity of the $\mathsf{E}$ operator.

On the other hand, since in general $$\mathsf{Var}(X+Y)=\mathsf{Var}(X)+\mathsf{Var}(Y)+2\mathsf{Cov}(X,Y),$$ in order to have $$\mathsf{Var}(X+Y)=\mathsf{Var}(X)+\mathsf{Var}(Y)$$

we must have $\mathsf{Cov}(X,Y)=0$, i.e. $X$ and $Y$ must be uncorrelated. Since $\mathsf{Cov}(X,Y)=\mathsf{E}(XY)-\mathsf{E}(X)\mathsf{E}(Y)$, this is equivalent to $\mathsf{E}(XY)=\mathsf{E}(X)\mathsf{E}(Y)$, which holds in particular when $X$ and $Y$ are independent.

When, in addition, the mean of $X$ or $Y$ is zero ($\mathsf{E}(X)=0$ or $\mathsf{E}(Y)=0$), then $\mathsf{E}(XY)=\mathsf{E}(X)\mathsf{E}(Y)=0$. This is exactly what happens when you subtract the mean!
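A simulation makes both facts visible. This is a sketch of my own (not from the answer), with two arbitrarily chosen independent distributions:

```python
import random

random.seed(0)
n = 200_000
# Two independent samples: X ~ N(3, 2^2), Y ~ Uniform(-1, 5)
xs = [random.gauss(3.0, 2.0) for _ in range(n)]
ys = [random.uniform(-1.0, 5.0) for _ in range(n)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

# Cov(X, Y) = E(XY) - E(X)E(Y), estimated from the sample
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
total = var([x + y for x, y in zip(xs, ys)])

print(cov)                  # ≈ 0: independence forces E(XY) = E(X)E(Y)
print(total)                # ≈ Var(X) + Var(Y) = 4 + 3 = 7
print(var(xs) + var(ys))
```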

**Answer:**

As you know, $\mathsf {Var}(Z)=\mathsf E(Z^2)-\mathsf E(Z)^2$.

Substituting $Z\gets X+Y$, then expanding and rearranging using the linearity of expectation, shows that the "cross terms" mentioned above are exactly (twice) the covariance:

$$\begin{align}\mathsf {Var}(X+Y) ~&=~ \mathsf E((X+Y)^2)-(\mathsf E(X+Y))^2 \\[1ex] &=~ \mathsf E((X+Y)^2)-(\mathsf E(X)+\mathsf E(Y))^2 \\[1ex] &=~ \mathsf E(X^2+2XY+Y^2) - \mathsf E(X)^2-2\mathsf E(X)\mathsf E(Y)-\mathsf E(Y)^2 \\[1ex] &=~ \mathsf E(X^2)+2\mathsf E(XY)+\mathsf E(Y^2) - \mathsf E(X)^2-2\mathsf E(X)\mathsf E(Y)-\mathsf E(Y)^2 \\[1ex] &=~ \mathsf E(X^2)-\mathsf E(X)^2+\mathsf E(Y^2)-\mathsf E(Y)^2+\underbrace{2~(\mathsf E(XY)-\mathsf E(X)\mathsf E(Y))}_{\text{the }``\text{cross terms}''} \\[1ex] &=~ \mathsf {Var}(X)+\mathsf {Var}(Y)+2~\mathsf{Cov}(X,Y) \end{align}$$
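The identity just derived holds exactly for sample moments as well, even when the variables are dependent. Here is a small check of my own (with $Y$ deliberately built from $X$ so the covariance term is nonzero):

```python
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
# Deliberately dependent: Y = X + independent noise, so Cov(X, Y) > 0
ys = [x + random.gauss(0.0, 1.0) for x in xs]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

def cov(a, b):
    return mean([u * w for u, w in zip(a, b)]) - mean(a) * mean(b)

lhs = var([x + y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
print(abs(lhs - rhs))  # → tiny (floating-point error only): the identity is exact
```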

Now, should these random variables be independent, it can be shown that $\mathsf E(XY)=\mathsf E(X)\;\mathsf E(Y)$, so the variables are uncorrelated. (NB: the converse is not necessarily true.)
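To see that the converse can fail, here is the standard counterexample (my addition, not from the answer): a symmetric $X$ with $Y=X^2$, which are dependent yet uncorrelated:

```python
# X uniform on {-1, 0, 1}; Y = X^2 is completely determined by X,
# yet E(XY) = E(X^3) = 0 = E(X)·E(Y), so Cov(X, Y) = 0.
xs = [-1, 0, 1]
ys = [x * x for x in xs]

def mean(v):
    return sum(v) / len(v)

e_xy = mean([x * y for x, y in zip(xs, ys)])
print(e_xy, mean(xs) * mean(ys))  # → 0.0 0.0
```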