Determining variance from sum of two random correlated variables


I understand that the variance of the sum of two independent normally distributed random variables is the sum of the variances, but how does this change when the two random variables are correlated?

Best answer:

For any two random variables: $$\text{Var}(X+Y) =\text{Var}(X)+\text{Var}(Y)+2\text{Cov}(X,Y).$$ If the variables are uncorrelated (that is, $\text{Cov}(X,Y)=0$), then

$$\tag{1}\text{Var}(X+Y) =\text{Var}(X)+\text{Var}(Y).$$ In particular, if $X$ and $Y$ are independent, then equation $(1)$ holds.

In general $$ \text{Var}\Bigl(\,\sum_{i=1}^n X_i\,\Bigr)= \sum_{i=1}^n\text{Var}( X_i)+ 2\sum_{i< j} \text{Cov}(X_i,X_j). $$ If for each $i\ne j$, $X_i$ and $X_j$ are uncorrelated, in particular if the $X_i$ are pairwise independent (that is, $X_i$ and $X_j$ are independent whenever $i\ne j$), then $$ \text{Var}\Bigl(\,\sum_{i=1}^n X_i\,\Bigr)= \sum_{i=1}^n\text{Var}( X_i) . $$
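A quick numerical check of the identity above, using simulated correlated samples (the coefficient $0.5$ and the sample size are arbitrary choices for illustration). The identity holds exactly for sample moments when variance and covariance use the same normalization, which is why `bias=True` is passed to `np.cov` to match `np.var`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw correlated samples: Y = 0.5*X + noise, so Cov(X, Y) > 0.
n = 200_000
x = rng.normal(0.0, 1.0, n)
y = 0.5 * x + rng.normal(0.0, 1.0, n)

var_sum = np.var(x + y)
# Var(X) + Var(Y) + 2 Cov(X, Y); bias=True makes np.cov divide by n, like np.var.
formula = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]

print(var_sum, formula)  # the two numbers agree up to floating-point error
```

Dropping the covariance term here would understate the variance, since $X$ and $Y$ are positively correlated.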

Another answer:

You can also think in vector form:

$$\text{Var}(a^T X) = a^T \text{Var}(X) a$$

where $a$ is a vector of constants, $X = (X_1, X_2, \dots, X_n)^T$ is a vector of random variables, and $\text{Var}(X)$ is the covariance matrix of $X$. (If $a$ is replaced by a matrix $A$, the formula becomes $\text{Var}(AX) = A\,\text{Var}(X)\,A^T$.)

If $a = (1, 1, \dots, 1)^T$, then $a^T X = \sum_{i=1}^n X_i$, so $a^T \text{Var}(X)\, a$ is exactly the variance of the sum of the $X_i$.
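A small sketch of the vector form, with a hypothetical $3 \times 3$ covariance matrix: the quadratic form $a^T \,\text{Var}(X)\, a$ with $a$ the all-ones vector should match the scalar formula (sum of the variances plus twice the sum of the pairwise covariances):

```python
import numpy as np

# Hypothetical covariance matrix of X = (X1, X2, X3):
# variances on the diagonal, covariances off the diagonal.
Sigma = np.array([
    [2.0, 0.5, 0.1],
    [0.5, 1.0, 0.3],
    [0.1, 0.3, 1.5],
])

a = np.ones(3)            # a = (1, 1, 1)^T picks out the sum X1 + X2 + X3
var_sum = a @ Sigma @ a   # a^T Var(X) a

# Scalar formula: sum of variances + 2 * sum of covariances over i < j.
scalar = Sigma.trace() + 2 * (0.5 + 0.1 + 0.3)

print(var_sum, scalar)  # both equal 6.3
```

Note that $a^T \Sigma a$ simply sums every entry of $\Sigma$ when $a$ is all ones, which is another way to see why the off-diagonal covariances each appear twice.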