I've been trying to derive this on my own, but I'm at an impasse.
Suppose that $v$ is an $m \times 1$ random vector, and $A\in\mathbb{R}^{n\times m}$ is a matrix of constants. I want to show that $\text{Cov}(Av) = A\text{Cov}(v)A^{\top}$. It's easy to show this using the fact that $\text{Cov}(Av)=E\big((Av-E(Av))(Av-E(Av))^{\top}\big)$. However, it was suggested (or rather, I was told) not to use this fact. Rather, one ought to use the fact that $Av\in\mathbb{R}^{n\times 1}$, and $$[Av]_i = \sum_{k=1}^m A_{ik}v_k$$ for $i=1,\ldots,n$.
Anyone have suggestions? In particular, where does the $A^{\top}$ appear when considering the sum?
You are meant to prove this componentwise, i.e., show that the $(i,j)$ element of $\operatorname{Cov}(Av)$ equals the $(i,j)$ element of $A\operatorname{Cov}(v)A^\top$. Start with the fact that $$[\operatorname{Cov}(v)]_{ij}=\operatorname{Cov}(v_i, v_j),\tag1$$ where the RHS of (1) is the ordinary covariance of scalar random variables, to write $$\left[\operatorname{Cov}(Av)\right]_{ij}=\operatorname{Cov}\left((Av)_i, (Av)_j\right).\tag2$$ Then use your Fact to expand this as $$\operatorname{Cov}\left(\sum_kA_{ik}v_k, \sum_lA_{jl}v_l\right).\tag3$$ Then use bilinearity of covariance to write this last as $$\sum_k\sum_l A_{ik}A_{jl}\operatorname{Cov}(v_k,v_l).\tag4$$ At the end of all this you want to arrive at $$\left[A\operatorname{Cov}(v)A^\top\right]_{ij}.\tag5$$ Can you see how to get there?
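As a sanity check (not a proof), you can verify numerically that the double sum in (4) matches the $(i,j)$ entry of $A\operatorname{Cov}(v)A^\top$ in (5). The sketch below uses NumPy with an arbitrary matrix $A$ and an arbitrary symmetric positive semidefinite matrix standing in for $\operatorname{Cov}(v)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
A = rng.standard_normal((n, m))

# Any matrix of the form B B^T is symmetric positive semidefinite,
# so it is a valid stand-in for Cov(v).
B = rng.standard_normal((m, m))
Sigma = B @ B.T

# Entry (i, j) via the double sum (4): sum_k sum_l A_ik A_jl Cov(v_k, v_l).
lhs = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        lhs[i, j] = sum(A[i, k] * A[j, l] * Sigma[k, l]
                        for k in range(m) for l in range(m))

# Entry (i, j) of the matrix product A Cov(v) A^T, as in (5).
rhs = A @ Sigma @ A.T

print(np.allclose(lhs, rhs))  # the two agree entrywise
```

Seeing the two agree for a random example may also help you spot the pattern: grouping the sum over $l$ first turns $\sum_l A_{jl}\operatorname{Cov}(v_k,v_l)$ into an entry of $\operatorname{Cov}(v)A^\top$.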