Why is $Var(V) = a^2\sigma^2_x+b^2\sigma^2_y+c^2\sigma^2_w +2ab\sigma_{xy}+2ac\sigma_{xw}+2bc\sigma_{yw}$?


I'm reviewing some practice questions to study for an exam, and I'm stuck on this particular one.

The random variable $X$ has expected value $\mu_x$ and variance $\sigma^2_x<\infty$. The random variable $Y$ has expected value $\mu_y$ and variance $\sigma^2_y<\infty$. The random variable $W$ has expected value $\mu_w$ and variance $\sigma^2_w<\infty$. These random variables are not necessarily independent; in particular, the pairwise covariances are given by $cov(X,Y) = \sigma_{xy}$, $cov(X,W) = \sigma_{xw}$, and $cov(Y,W)=\sigma_{yw}$. Let $V = aX + bY + cW$,

where I believe that $a,b,c$ are constants (the question doesn't specify, but I'm 99% certain they are, based on the answers given to me).

Find $E(V)$ and $Var(V)$

Starting from $E(V) = E(aX + bY + cW)$, I can apply linearity of expectation to get $E(aX + bY + cW) = aE(X) + bE(Y) + cE(W) = a\mu_x + b\mu_y + c\mu_w$,

which matches the answer given to me.

I struggle with $Var(V)$. The answer is $Var(V) = a^2\sigma^2_x+b^2\sigma^2_y+c^2\sigma^2_w +2ab\sigma_{xy}+2ac\sigma_{xw}+2bc\sigma_{yw}$, but I have trouble getting there because I'm used to the random variables being independent of one another, and in this case they're not (at least not necessarily). So I tried $Var(V) = E(V^2) - E(V)^2 = E((aX+bY+cW)^2) - E(V)^2 = E(a^2X^2+b^2Y^2+c^2W^2+2abXY+2acXW+2bcYW) - (a\mu_x + b\mu_y + c\mu_w)^2.$

But this seems wrong, because it appears to require knowledge of the second moments of $X$, $Y$, and $W$. So what am I doing wrong? Is there an alternative formula I should be using? I feel like I should be using the given covariances, but I'm not sure how. Any hints or help would be greatly appreciated.
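In fact, the expansion above can be pushed through: when $(a\mu_x + b\mu_y + c\mu_w)^2$ is expanded, each of its six terms pairs with a corresponding term of the expanded second moment, and the differences are exactly variances and covariances. For instance,

$$E(a^2X^2)-a^2\mu_x^2=a^2\left(E(X^2)-\mu_x^2\right)=a^2\sigma^2_x,\qquad E(2abXY)-2ab\mu_x\mu_y=2ab\left(E(XY)-\mu_x\mu_y\right)=2ab\sigma_{xy},$$

and similarly for the remaining four terms, so no second moments are needed on their own.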

Thank you!


More generally, constants $a_i$ and random variables $X_i$ of finite variance satisfy$$\begin{align}\operatorname{Var}\sum_ia_iX_i&=E\left[\left(\sum_ia_iX_i\right)^2\right]-\left(E\sum_ia_iX_i\right)^2\\&=\sum_{ij}a_ia_j(EX_iX_j-EX_iEX_j)\\&=\sum_{ij}a_ia_j\operatorname{Cov}(X_i,\,X_j)\\&=\sum_ia_i^2\operatorname{Var}X_i+2\sum_{i>j}a_ia_j\operatorname{Cov}(X_i,\,X_j).\end{align}$$Your problem is the case of three variables with $(a_1,a_2,a_3)=(a,b,c)$. Even more generally, but by the same technique,$$\operatorname{Cov}\left(\sum_ia_iX_i,\,\sum_jb_jY_j\right)=\sum_{ij}a_ib_j\operatorname{Cov}(X_i,\,Y_j).$$
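The general formula $\operatorname{Var}\sum_i a_iX_i = \sum_{ij}a_ia_j\operatorname{Cov}(X_i,X_j) = a^\top\Sigma a$ is easy to sanity-check numerically. Here is a minimal sketch in Python/NumPy; the coefficients and the covariance matrix are made up for illustration, and jointly normal samples are used only because they are a convenient way to generate correlated variables:

```python
import numpy as np

# Hypothetical coefficients (a, b, c), means, and a positive-definite
# covariance matrix, chosen purely for illustration.
a = np.array([2.0, -1.0, 0.5])
mu = np.array([1.0, 0.0, -2.0])
Sigma = np.array([[4.0, 1.2, -0.8],   # diagonal entries are the variances,
                  [1.2, 9.0,  2.0],   # off-diagonal entries the covariances
                  [-0.8, 2.0, 1.0]])

# Closed form: Var(aX + bY + cW) = a^T Sigma a
var_closed = a @ Sigma @ a

# The same value written out term by term, as in the question's answer:
# a^2 s_x^2 + b^2 s_y^2 + c^2 s_w^2 + 2ab s_xy + 2ac s_xw + 2bc s_yw
var_expanded = (a[0]**2 * Sigma[0, 0] + a[1]**2 * Sigma[1, 1]
                + a[2]**2 * Sigma[2, 2]
                + 2 * a[0] * a[1] * Sigma[0, 1]
                + 2 * a[0] * a[2] * Sigma[0, 2]
                + 2 * a[1] * a[2] * Sigma[1, 2])

# Monte Carlo estimate from correlated samples; this should agree with
# the closed form up to sampling error.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=1_000_000)
V = samples @ a
var_mc = V.var()

print(var_closed)    # closed-form value a^T Sigma a
print(var_expanded)  # identical, via the six-term formula
print(var_mc)        # Monte Carlo estimate, close to the closed form
```

The agreement of `var_closed` and `var_expanded` is exact (they are the same quadratic form, just grouped differently), while `var_mc` converges to them as the sample size grows.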