Variance of a mixed variable


I am studying probability on my own and couldn't find anything on this in my textbooks. Say we have two random variables, $X$ and $Y$, with known means $\mu_{1}$ and $\mu_{2}$, and known variances $\sigma^{2}_{1}$ and $\sigma^{2}_{2}$. $A$ is defined to be a linear combination of both variables. How can you interpret the variance of $A$? And how can you calculate $\textbf{E}(A^{2})$ in order to find such variance?

Does this generalize to more complex functions $A(X,Y)$?

Best answer:

Write $A = aX + bY$ for constants $a$ and $b$. By the properties of variance, \begin{align*} \textbf{Var}(A) = \textbf{Var}(aX+bY) = a^{2}\textbf{Var}(X) + b^{2}\textbf{Var}(Y) + 2ab\,\textbf{Cov}(X,Y) \end{align*}

If $X$ and $Y$ are independent, they are in particular uncorrelated, so $\textbf{Cov}(X,Y) = 0$ and the formula reduces to \begin{align*} \textbf{Var}(A) = a^{2}\textbf{Var}(X) + b^{2}\textbf{Var}(Y) = a^{2}\sigma^{2}_{1} + b^{2}\sigma^{2}_{2} \end{align*}
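As a quick numerical sanity check (not part of the original answer), here is a minimal Monte Carlo sketch in Python. The distributions (normal), parameter values, and coefficients $a$, $b$ are illustrative assumptions; only the means, variances, and independence matter for the formula:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative (assumed) parameters: means, variances, coefficients
mu1, mu2 = 2.0, -1.0
var1, var2 = 3.0, 5.0
a, b = 4.0, 7.0

# Independent draws of X and Y
X = rng.normal(mu1, np.sqrt(var1), n)
Y = rng.normal(mu2, np.sqrt(var2), n)
A = a * X + b * Y

empirical = A.var()
theoretical = a**2 * var1 + b**2 * var2  # covariance term vanishes by independence

print(empirical, theoretical)
```

With a million samples the empirical variance should agree with the theoretical value $a^{2}\sigma^{2}_{1} + b^{2}\sigma^{2}_{2} = 293$ to within a fraction of a percent.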

As you may have noticed, we did not need $\textbf{E}(A^{2})$ directly in order to obtain the variance: the properties of variance alone suffice.
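That said, since the question also asks for $\textbf{E}(A^{2})$, it can be computed by expanding the square; assuming independent $X$ and $Y$ (so $\textbf{E}(XY) = \mu_{1}\mu_{2}$) and using $\textbf{E}(X^{2}) = \sigma^{2}_{1} + \mu^{2}_{1}$:
\begin{align*}
\textbf{E}(A^{2}) &= \textbf{E}\big((aX+bY)^{2}\big) \\
&= a^{2}\textbf{E}(X^{2}) + 2ab\,\textbf{E}(XY) + b^{2}\textbf{E}(Y^{2}) \\
&= a^{2}(\sigma^{2}_{1}+\mu^{2}_{1}) + 2ab\,\mu_{1}\mu_{2} + b^{2}(\sigma^{2}_{2}+\mu^{2}_{2})
\end{align*}
Subtracting $\textbf{E}(A)^{2} = (a\mu_{1}+b\mu_{2})^{2}$ recovers exactly the variance formula above.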

As to the general case, the strategy to solve the problem depends on the expression of $A(X,Y)$.
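For instance, when the joint distribution is known, a Monte Carlo estimate works for essentially any measurable $A(X,Y)$. A hedged sketch, taking (purely for illustration) $A(X,Y) = XY$ with independent normal inputs, where a closed form happens to exist for comparison:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Illustrative (assumed) parameters
mu1, mu2 = 2.0, -1.0
var1, var2 = 3.0, 5.0

X = rng.normal(mu1, np.sqrt(var1), n)
Y = rng.normal(mu2, np.sqrt(var2), n)

# A(X, Y) = X * Y, chosen as an example of a nonlinear combination
A = X * Y
empirical = A.var()

# For independent X and Y the product has the closed form
# Var(XY) = var1*var2 + var1*mu2^2 + var2*mu1^2
closed_form = var1 * var2 + var1 * mu2**2 + var2 * mu1**2

print(empirical, closed_form)
```

The same simulation strategy applies when no closed form is available, which is the typical situation for complicated $A(X,Y)$.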