Let $x$ be Gaussian with covariance $\Sigma_x$, and let $A$ be a square matrix whose columns are all unit vectors.
How can I prove that to maximize $\det(A\Sigma_x A^T)$, I can just set the columns of $A$ to be the eigenvectors of $\Sigma_x$?
$\det(A\Sigma A^T)=\det(A)^2\det(\Sigma)$, so you just want to maximize $\det(A)^2$, i.e. $|\det(A)|$. Can you see how to conclude?
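As a numerical sanity check of this identity and of the claim that orthonormal columns attain the maximum, here is a small NumPy sketch (the dimension, seed, and the way $\Sigma$ is generated are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# A random symmetric positive-definite covariance matrix Sigma.
B = rng.standard_normal((n, n))
Sigma = B @ B.T + n * np.eye(n)

# A random square matrix A with unit-norm columns.
A = rng.standard_normal((n, n))
A /= np.linalg.norm(A, axis=0)

# Identity: det(A Sigma A^T) = det(A)^2 det(Sigma).
lhs = np.linalg.det(A @ Sigma @ A.T)
rhs = np.linalg.det(A) ** 2 * np.linalg.det(Sigma)
assert np.isclose(lhs, rhs)

# Since the columns of A are unit vectors, det(A)^2 <= 1,
# so det(A Sigma A^T) <= det(Sigma).
assert lhs <= np.linalg.det(Sigma) + 1e-9

# An orthogonal A (e.g. the eigenvectors of Sigma) attains the maximum.
_, Q = np.linalg.eigh(Sigma)
assert np.isclose(np.linalg.det(Q @ Sigma @ Q.T), np.linalg.det(Sigma))
```

Note that any orthogonal $A$ attains the maximum, so the eigenvectors of $\Sigma_x$ are one maximizer among many.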
Try proving the following lemma.
Lemma. Let $A$ be a square matrix whose columns all have norm $\leq 1$. Then $|\det(A)|\leq 1$, with equality if and only if $A$ is orthogonal.
Hint: Gram-Schmidt...
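Spelling out the hint a bit (this is the standard Gram–Schmidt/QR argument, written here for completeness):

$$
A = QR, \qquad Q \text{ orthogonal},\quad R \text{ upper triangular with } r_{ii} \geq 0.
$$

Then for each column $a_i$ of $A$,

$$
\|a_i\|^2 = \sum_{j \leq i} r_{ji}^2 \;\geq\; r_{ii}^2,
\qquad\text{so}\qquad r_{ii} \leq \|a_i\| \leq 1,
$$

and hence

$$
|\det(A)| = |\det(Q)|\,\det(R) = \prod_i r_{ii} \leq 1,
$$

with equality forcing $r_{ii} = 1$ and $r_{ji} = 0$ for $j < i$, i.e. $R = I$ and $A = Q$ orthogonal.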