Suppose that the variance-covariance matrix of a $p$-dimensional random vector $X$ is


Suppose that the variance-covariance matrix of a $p$-dimensional random vector $X$ is $\Sigma=(\sigma_{ij})$, $i,j=1,2,\dots,p$. Show that the coefficients of the first principal component all have the same sign, while this need not hold for the remaining components.

I tried using the eigenvalues and eigenvectors, but I couldn't conclude anything. Can someone help me with the proof?


In practice this is mostly true, because eigenvectors of a symmetric matrix corresponding to different eigenvalues are orthogonal. In particular, if $\Sigma$ has strictly positive entries, the Perron–Frobenius theorem guarantees that the eigenvector for the largest eigenvalue (i.e. the first principal component) can be chosen with all positive entries; orthogonality then forces the second principal component to have both negative and positive entries. Without such a positivity assumption, however, it is quite simple to find a covariance matrix in which every principal component has entries of both signs:

Rotate the matrix $$\begin{pmatrix}-1&0&0\\0&-1&0\\0&0&-1\end{pmatrix}$$ around the three coordinate axes by, say, $\pi/20$ each. This leads to an orthonormal eigenvector matrix $$S\approx\begin{pmatrix}-0.98 &0.13&-0.18\\-0.15&-0.98&0.13\\0.16&-0.15&-0.98\end{pmatrix}$$ to which you can apply your favourite positive definite diagonal matrix $D$ to get a covariance matrix $C=SDS^\top$ having the columns of $S$ as principal components.
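The construction above can be checked numerically; the following sketch uses NumPy, with the rotation angle $\pi/20$ taken from the answer and $D=\operatorname{diag}(3,2,1)$ as one illustrative choice of positive definite diagonal matrix:

```python
import numpy as np

# Rotate -I about the three coordinate axes by pi/20 each: the product of
# the rotations is orthogonal, so S = -(Rx @ Ry @ Rz) is orthonormal.
theta = np.pi / 20
c, s = np.cos(theta), np.sin(theta)
Rx = np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
Ry = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
S = -(Rx @ Ry @ Rz)

D = np.diag([3.0, 2.0, 1.0])   # illustrative positive definite diagonal matrix
C = S @ D @ S.T                # covariance matrix with the columns of S as PCs

# Each column of S is an eigenvector of C, and every column mixes signs.
for j in range(3):
    assert np.allclose(C @ S[:, j], D[j, j] * S[:, j])
    assert (S[:, j] > 0).any() and (S[:, j] < 0).any()
```

Since the diagonal entries of $D$ are distinct, the eigenvalues of $C$ are simple and the principal components are determined up to sign, so every one of them genuinely has coefficients of both signs.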