Looking for a proof of: variance of a sum is the sum of the variances.


For independent random variables X and Y, the variance of their sum or difference is the sum of their variances.

I can see why the above should be true: if $x_1<X<x_2$ and $y_1 <Y < y_2$, then clearly $x_1+y_1 <X+Y<x_2+y_2$. But proving this seems a bit hard. Here is my attempt:

$\operatorname{Var}(X) = \sum_i [x_i - E(X)]^2 p_i$
$\operatorname{Var}(Y) = \sum_j [y_j - E(Y)]^2 q_j$,
then I guess the variance of the sum should be:
$\operatorname{Var}(X+Y) = \sum [(x_i+y_j) - E(X+Y)]^2\color{red}{p_{??}}$

There is no way something like $(a+b+m)^2$ simplifies to $(a+m)^2 + (b+m)^2$. I'm stuck here; any help?
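Before looking at the proofs below, the claim itself can be checked by brute force. The sketch below enumerates the joint pmf of two small independent discrete variables (the supports and probabilities are made up for illustration) and compares $\operatorname{Var}(X+Y)$ with $\operatorname{Var}(X)+\operatorname{Var}(Y)$; note that under independence the joint probability factors as $p_i q_j$, which answers the red $p_{??}$ above:

```python
# Numerically verify Var(X+Y) = Var(X) + Var(Y) for two small
# independent discrete distributions (values/probabilities are arbitrary).
xs = {1: 0.2, 2: 0.5, 3: 0.3}   # pmf of X: value -> probability
ys = {0: 0.6, 4: 0.4}           # pmf of Y

def mean(pmf):
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum((v - m) ** 2 * p for v, p in pmf.items())

# Under independence the joint pmf factorizes: P(X=x, Y=y) = p_x * q_y,
# so the pmf of X+Y is built by summing p_x * q_y over each value x+y.
sum_pmf = {}
for x, px in xs.items():
    for y, py in ys.items():
        sum_pmf[x + y] = sum_pmf.get(x + y, 0.0) + px * py

assert abs(var(sum_pmf) - (var(xs) + var(ys))) < 1e-12
```

This is only a sanity check for one pair of distributions, of course, not a proof.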


3 Answers

BEST ANSWER

By subtracting off the means, it is sufficient to consider the case when $X$ and $Y$ are centered (i.e., $\mathbb EX = \mathbb EY=0$). Then $$ \text{Var}(X\pm Y)=\mathbb E(X\pm Y)^2=\mathbb E X^2\pm 2\mathbb E(XY)+\mathbb EY^2. $$ Now since $X$ and $Y$ are independent and centered, $\mathbb E(XY)=(\mathbb EX)(\mathbb EY)=0$ and therefore $$ \text{Var}(X\pm Y)=\text{Var}( X)+\text{Var} (Y). $$
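The argument above can also be checked numerically for both signs at once. A quick Monte Carlo sketch (distributions and sample size chosen arbitrarily) estimates $\operatorname{Var}(X+Y)$ and $\operatorname{Var}(X-Y)$ from independent samples and compares both against $\operatorname{Var}(X)+\operatorname{Var}(Y)$:

```python
import random

random.seed(0)
n = 200_000
# Independent samples from two arbitrary distributions:
# X ~ Normal(3, sd=2) has variance 4; Y ~ Uniform(-1, 5) has variance 3.
xs = [random.gauss(3.0, 2.0) for _ in range(n)]
ys = [random.uniform(-1.0, 5.0) for _ in range(n)]

def var(sample):
    m = sum(sample) / len(sample)
    return sum((v - m) ** 2 for v in sample) / len(sample)

# Both Var(X+Y) and Var(X-Y) should be close to Var(X) + Var(Y) = 7.
lhs_plus = var([x + y for x, y in zip(xs, ys)])
lhs_minus = var([x - y for x, y in zip(xs, ys)])
rhs = var(xs) + var(ys)
print(lhs_plus, lhs_minus, rhs)  # all roughly 7, up to sampling noise
```

The minus sign matters: $\operatorname{Var}(X-Y)$ is the *sum*, never the difference, of the variances, exactly as the $\pm$ in the derivation shows.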


Use the definition to expand and simplify:

$\operatorname{Var}(X+Y) = \mathbb{E}\big[(X+Y)^2\big] - (\mu_X+\mu_Y)^2$.


Using the fact that $V(A) = E(A^2) - [E(A)]^2$, we have: \begin{align*} V(X+Y) &= E[(X+Y)^2] - E^2(X+Y)\\ &=[E(X^2) + E(Y^2) + 2E(XY) ] - [E^2(X) + E^2(Y) + 2E(X)E(Y)]\\ &=E(X^2) - E^2(X) + E(Y^2) - E^2(Y) + 2E(XY) - 2E(X)E(Y)\\ &= V(X) +V(Y) + 2 \operatorname{cov}(X,Y). \end{align*} When is $\operatorname{cov}(X,Y)=0$?
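The closing question can be illustrated with a short simulation. Independence implies $\operatorname{cov}(X,Y)=0$ (the converse fails in general); the sketch below estimates the sample covariance of two independent standard normal samples, and contrasts it with a variable's covariance with itself:

```python
import random

random.seed(1)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [random.gauss(0.0, 1.0) for _ in range(n)]  # drawn independently of xs

def cov(a, b):
    # Sample covariance: E[(A - mean(A)) * (B - mean(B))].
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

print(cov(xs, ys))  # near 0: independent variables are uncorrelated
print(cov(xs, xs))  # near 1: cov(X, X) = Var(X), here the variance of N(0,1)
```

So for independent $X$ and $Y$ the cross term $2\operatorname{cov}(X,Y)$ vanishes and the identity reduces to $V(X+Y)=V(X)+V(Y)$.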