Consider a positive (semi) definite, symmetric matrix $$\Sigma \in \mathbb{R}^{k\times k}$$ and a matrix $$A \in \mathbb{R}^{k \times m}$$.
Under what conditions is $$B = A^T \Sigma A$$ positive (semi) definite?
I had the following idea: take any $x$ and decompose it into eigenvectors $v_i$ of $A$, with corresponding eigenvalues $\lambda_i$, i.e. $$ x = \sum_{i} c_i v_i.$$
Then we have $$x^T A^T \Sigma A x = \sum_{i} \lambda_i^2 c_i^2 \, v_i^T \Sigma v_i + \sum_{p \neq q} c_p c_q \lambda_p \lambda_q \, v_p^T \Sigma v_q.$$
The first sum is $\geq 0$ by the positive semidefiniteness of $\Sigma$.
If we could choose the $v_i$ to be orthogonal w.r.t. the scalar product $\langle x, y \rangle_\Sigma = x^T \Sigma y$, the cross terms would vanish and we would be done. Can we always choose them in such a way?
If $A$ is not square, you cannot talk of its eigenvalues/eigenvectors. Even if it is square, you cannot assume that the eigenvectors are orthogonal w.r.t. the scalar product that you give. Consider for instance $\Sigma = I$; then you would need the eigenvectors of $A$ to be orthogonal, which means that $A$ itself would have to be orthogonally diagonalizable.
In general, the easiest way to prove p.s.d.-ness is from the definition: for $x \in \mathbb{R}^m$, compute $$ x^T (A^T \Sigma A) x = (A x)^T \Sigma (Ax), $$ and this is nonnegative since $\Sigma$ is p.s.d. So $B$ is always p.s.d.; moreover, it is positive definite precisely when $\Sigma$ is positive definite and $A$ has full column rank (so that $Ax \neq 0$ whenever $x \neq 0$).
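The argument above is easy to sanity-check numerically. A minimal sketch (the matrix sizes and random-seed setup are just illustrative choices, not from the question):

```python
import numpy as np

# Build a random p.s.d. Sigma (k x k) via M M^T and an arbitrary A (k x m),
# then check that B = A^T Sigma A has no negative eigenvalues.
rng = np.random.default_rng(0)
k, m = 5, 3

M = rng.standard_normal((k, k))
Sigma = M @ M.T                  # M M^T is always p.s.d.
A = rng.standard_normal((k, m))

B = A.T @ Sigma @ A              # the matrix in question, m x m

# B is symmetric, so eigvalsh applies; p.s.d. means all eigenvalues >= 0
eigvals = np.linalg.eigvalsh(B)
print(eigvals.min() >= -1e-10)   # True, up to floating-point tolerance
```

Replacing the fixed seed with a loop over many random draws gives the same result, as the identity $x^T B x = (Ax)^T \Sigma (Ax)$ predicts.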