Let $(X, Y)$ be a bivariate normal vector with mean vector $\mu = (0, 0)$ and covariance matrix $$ \Sigma = \begin{bmatrix}2 & 1 \\ 1 & 2\end{bmatrix}. $$
I'm given $$ A^\top \Sigma A = \begin{bmatrix}3 & 0 \\ 0 & 1\end{bmatrix}, $$ with $A^\top A = I$.
Let $U = u_1 X + u_2 Y$ and $V = v_1 X + v_2 Y$, where $A = \begin{bmatrix}u_1 & v_1 \\ u_2 & v_2\end{bmatrix}$; show that $U$ and $V$ are independent. First I found $A = \begin{bmatrix}1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2}\end{bmatrix}$, and then that $E(U) = E(V) = 0$, $\operatorname{Var}(U) = 2$, $\operatorname{Var}(V) = 0$ and $\operatorname{Cov}(U, V) = 2$, using various rules for variance and expectation together with the matrix $\Sigma$, which gives $\operatorname{Var}(X) = \operatorname{Var}(Y) = 2$ and $\operatorname{Cov}(X, Y) = 1$. From here I'm not sure where to go. I have tried finding the joint density function $f_{X,Y}(x, y)$, but I'm not sure how it relates to $f_{U,V}(u, v)$. How do I go about showing that $U$ and $V$ are independent? Any help is greatly appreciated.
Since $\begin{bmatrix}U\\V\end{bmatrix} = A^\top\begin{bmatrix}X\\Y\end{bmatrix}$, its covariance matrix is $$\text{Cov}\left(A^\top\begin{bmatrix}X\\Y\end{bmatrix} \right) = A^\top \text{Cov}\left(\begin{bmatrix}X\\Y\end{bmatrix} \right) A = A^\top\Sigma A = \begin{bmatrix}3 & 0 \\ 0 & 1\end{bmatrix},$$ so we automatically obtain $\text{Var}(U)=3$, $\text{Var}(V)=1$, and $\text{Cov}(U,V)=0$. Moreover, $(U, V)$ is jointly normal, being a linear transformation of a bivariate normal vector, and jointly normal random variables with zero covariance are independent — this is the fact that finishes the problem (it is special to the normal case; zero covariance alone does not imply independence in general).
If you want to do it by hand (substituting $u_1 = u_2 = v_1 = 1/\sqrt{2}$ and $v_2 = -1/\sqrt{2}$),
\begin{align}
\text{Var}(U) = \text{Var}(u_1 X + u_2 Y) &= \tfrac{1}{2} \text{Var}(X) + \tfrac{1}{2} \text{Var}(Y) + 2 \cdot \tfrac{1}{2} \text{Cov}(X, Y) = 3 \\
\text{Var}(V) = \text{Var}(v_1 X + v_2 Y) &= \tfrac{1}{2} \text{Var}(X) + \tfrac{1}{2} \text{Var}(Y) - 2 \cdot \tfrac{1}{2}\text{Cov}(X, Y) = 1 \\
\text{Cov}(U, V) = \text{Cov}(u_1 X + u_2 Y, v_1 X + v_2 Y) &= \tfrac{1}{2} \text{Var}(X) - \tfrac{1}{2} \text{Var}(Y) - \tfrac{1}{2} \text{Cov}(X, Y) + \tfrac{1}{2} \text{Cov}(Y, X) = 0.
\end{align}
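If it helps, here is a quick numerical sanity check of the computation above — a sketch using NumPy (the variable names are mine, not from the problem): it forms $A^\top \Sigma A$ directly and also estimates $\operatorname{Cov}(U, V)$ by simulation.

```python
import numpy as np

# Covariance matrix of (X, Y) from the question
Sigma = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

s = 1.0 / np.sqrt(2.0)
A = np.array([[s,  s],
              [s, -s]])  # columns are (u1, u2) and (v1, v2)

# Covariance matrix of (U, V) = A^T (X, Y): should be diag(3, 1)
cov_UV = A.T @ Sigma @ A
print(np.round(cov_UV, 10))  # [[3. 0.] [0. 1.]]

# Monte Carlo check: sample (X, Y), transform to (U, V), estimate the covariance
rng = np.random.default_rng(0)
XY = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=200_000)
UV = XY @ A  # row i is (x_i, y_i) A, i.e. (U_i, V_i)
print(np.cov(UV.T))  # approximately [[3, 0], [0, 1]]
```

The off-diagonal sample covariance hovers near zero, matching the exact result $\operatorname{Cov}(U, V) = 0$.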