I was hoping someone could provide some intuition to some proofs of vector variances. The expectation case seems rather simple but I get confused when trying to work it out for the variance.
In the case of $$Var_x(a^T x) = \int \sum_i a_i (x_i - \bar x_i) \sum_j a_j(x_j - \bar x_j)\, p(x)\, dx$$
$$= \sum_i \sum_j a_i a_j \int (x_i - \bar x_i)(x_j - \bar x_j)\, p(x)\, dx$$
The result is $$a^T V_x(x) a,$$ but I do not quite see how this works.
Similarly, for the matrix case $$Var_x(Ax),$$ I don't know how to get to the result $$A \Sigma A^T.$$
I would really appreciate it if someone could explain where the transposes come from, specifically in index notation, as that is the method we are following in our class.
Thanks
You can write $Ax = \sum_i a_i x_i$, where $a_i$ is the $i^{th}$ column of $A$ and $x_i$ is the $i^{th}$ component of $x$. This means $Var(Ax) = Var(\sum_i a_i x_i) = \sum_i \sum_j a_i a_j^T \, cov(x_i, x_j) = A\Sigma A^T$, where $\Sigma$ is the variance-covariance matrix of $x$. (Note the transpose on $a_j^T$: each term is an outer product of two column vectors, which is what makes the result a matrix.)
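To spell out the same thing entirely in indices (writing $\Sigma_{ij} = cov(x_i, x_j)$), which is where the transposes become visible:

$$Var(a^T x) = \sum_i \sum_j a_i\, \Sigma_{ij}\, a_j = a^T \Sigma a,$$

$$[Var(Ax)]_{k\ell} = cov\Big(\sum_i A_{ki} x_i,\ \sum_j A_{\ell j} x_j\Big) = \sum_i \sum_j A_{ki}\, \Sigma_{ij}\, A_{\ell j} = [A \Sigma A^T]_{k\ell}.$$

The second factor carries the transpose because its row index $\ell$ is the output index: $A_{\ell j} = (A^T)_{j\ell}$, so the triple sum is exactly the $(k,\ell)$ entry of the matrix product $A \Sigma A^T$.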
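If it helps to see the identities hold concretely, here is a small numerical sanity check with NumPy; the particular $\Sigma$, $a$, and $A$ below are arbitrary illustrative choices, not anything from the question:

```python
import numpy as np

# Check Var(a^T x) = a^T Sigma a and Var(Ax) = A Sigma A^T by simulation.
rng = np.random.default_rng(0)

B = rng.standard_normal((3, 3))
Sigma = B @ B.T                       # a valid covariance matrix (symmetric PSD)

# Draw many samples of x with covariance Sigma
x = rng.multivariate_normal(np.zeros(3), Sigma, size=500_000)

a = rng.standard_normal(3)            # arbitrary vector
A = rng.standard_normal((2, 3))       # arbitrary matrix

# Scalar case: empirical Var(a^T x) vs. closed form a^T Sigma a
scalar_emp = np.var(x @ a, ddof=1)
scalar_cf = a @ Sigma @ a

# Matrix case: empirical Var(Ax) vs. closed form A Sigma A^T
matrix_emp = np.cov((x @ A.T).T)      # np.cov treats rows as variables
matrix_cf = A @ Sigma @ A.T

print(abs(scalar_emp - scalar_cf))            # small Monte Carlo error
print(np.max(np.abs(matrix_emp - matrix_cf))) # small Monte Carlo error
```

The discrepancies shrink as the sample size grows, as expected for a Monte Carlo estimate.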