Suppose that the variance-covariance matrix of a $p$-dimensional random vector $X$ is $\Sigma = (\sigma_{ij})$, $i,j = 1, 2, \dots, p$. Show that the coefficients of the first principal component all have the same sign, but that this is not true for the remaining components.
I tried using the eigenvalues and eigenvectors, but I couldn't conclude anything. Can someone help me with the proof?
In practice this is mostly true. The claim about the first principal component holds whenever all entries of $\Sigma$ are strictly positive, by the Perron–Frobenius theorem, and the claim about the remaining components then follows because eigenvectors belonging to distinct eigenvalues of a symmetric matrix are orthogonal: if the eigenvector of the largest eigenvalue (i.e., the first principal component) has only positive entries, every other principal component must contain both negative and positive entries, since its inner product with the first must vanish. Without the positivity assumption, however, it is quite simple to find a covariance matrix whose first principal component has both positive and negative coefficients:
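For instance (a sketch in Python/NumPy; the $2\times 2$ matrix below is my own illustrative choice): take two variables with a strong negative covariance. The leading eigenvector then has entries of opposite signs.

```python
import numpy as np

# A valid covariance matrix with a negative covariance term
# (symmetric and positive definite: eigenvalues 1 - 0.9 and 1 + 0.9).
Sigma = np.array([[1.0, -0.9],
                  [-0.9, 1.0]])

# eigh is for symmetric matrices; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(Sigma)

# First principal component = eigenvector of the largest eigenvalue.
pc1 = eigvecs[:, -1]

print(eigvals)  # [0.1, 1.9]
print(pc1)      # proportional to (1, -1)/sqrt(2): mixed signs
```

The first principal component is proportional to $(1, -1)/\sqrt{2}$ (the overall sign returned by `eigh` is arbitrary), so its coefficients do not share a sign; the second component, forced to be orthogonal to it, is proportional to $(1, 1)/\sqrt{2}$.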