When is the covariance matrix $X^TX$ positive semidefinite?


I have a matrix $X$ (of shape $[2, 512]$) consisting of $2$ observations of a $512$-dimensional feature vector.

I thought that, since $x_0$ and $x_1$ are linearly independent (they contain many $0$s), it would follow that after centering, $X \leftarrow X - \mu(X)$, the matrix $X^TX$ is at least positive semidefinite.

However, in my experiment I am getting large negative eigenvalues, which is contradictory. So my question is: under what conditions is $X^TX$ positive semidefinite?

Best answer:

Always, if your matrix is real. If $X^TXv=\lambda v$ with $v$ a unit vector, then $$ \lambda=v^T(\lambda v)=v^TX^TXv=(Xv)^TXv=\|Xv\|^2\geq0. $$ If $X$ is not real, it is possible for $X^TX$ to fail to be positive semidefinite; in the complex case, the right notion is the adjoint $X^*$, the conjugate transpose, for which $X^*X$ is always positive semidefinite.
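A quick numerical sketch (using NumPy; the shape $[2, 512]$ is taken from the question, the data is random) illustrates both points. Note that with only $2$ observations the centered $X^TX$ has rank at most $1$, so almost all of its eigenvalues are exactly zero; a symmetric eigensolver like `eigvalsh` keeps these non-negative up to round-off, whereas a general solver applied to a not-quite-symmetric matrix can report spurious negative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Real case: 2 observations of a 512-dimensional feature vector,
# centered as X - mean(X), as in the question.
X = rng.standard_normal((2, 512))
X = X - X.mean(axis=0)
G = X.T @ X  # 512 x 512 Gram matrix, rank at most 1 after centering

# eigvalsh exploits the symmetry of G; its eigenvalues are >= 0
# up to floating-point round-off.
evals = np.linalg.eigvalsh(G)
print(evals.min())

# Complex counterexample: for Z = [[1j]], Z^T Z = [[-1]] is not
# positive semidefinite, while the adjoint form Z* Z = [[1]] is.
Z = np.array([[1j]])
print((Z.T @ Z).real)         # [[-1.]]
print((Z.conj().T @ Z).real)  # [[1.]]
```

If your experiment produced *large* negative eigenvalues rather than values within round-off of zero, it is worth checking that the matrix passed to the eigensolver is actually $X^TX$ and is symmetric.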