I am trying to verify numerically (in Julia) that
A symmetric matrix $\mathbf{A}$ is positive semidefinite if and only if it is a covariance matrix.
That is, I need to verify both directions:
- Given a positive semidefinite matrix $\mathbf{A}$, show that it is a covariance matrix.
- Given a covariance matrix, show that it is positive semidefinite.
However, I am not sure
- what properties a matrix must have in order to be a covariance matrix;
- I know I could generate a covariance matrix using the code below, and I know that `cov` is positive semidefinite if and only if all of its eigenvalues are non-negative. But it turns out that `minimum(eigvals(cov))` is a negative number close to 0 (on the order of $\sim 10^{-15}$), so I am not sure whether I can conclude that `cov` is positive semidefinite, for numerical reasons.

```julia
using LinearAlgebra

n = 100
u = randn(n);
cov = u * u'
```
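One common way to make the eigenvalue check robust to roundoff is to compare against a tolerance scaled to machine precision and the size of the matrix. A minimal sketch (the tolerance choice and the `Symmetric` wrapper are my own assumptions, not the only convention):

```julia
using LinearAlgebra

n = 100
u = randn(n)
C = u * u'                        # rank one, so exactly p.s.d. in real arithmetic

λmin = minimum(eigvals(Symmetric(C)))

# Treat eigenvalues above -tol as non-negative, with tol scaled to
# machine epsilon, the dimension, and the norm of the matrix.
tol = n * eps(Float64) * opnorm(C)
is_psd = λmin ≥ -tol
```

With this convention the matrix above is reported as p.s.d. even when `λmin` comes out as a tiny negative number.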
Any input will be appreciated.
Maybe it's easier to verify that a covariance matrix is a Gram matrix (and vice versa) and to verify that a p.s.d. matrix is a Gram matrix (and vice versa). The numerical linear algebra step could then be the Cholesky decomposition.
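A sketch of that route (the variable names and the shift by $n\mathbf{I}$, which makes the matrix strictly positive definite so plain Cholesky applies, are my own choices): factor $\mathbf{A} = \mathbf{L}\mathbf{L}^\top$; then $\mathbf{x} = \mathbf{L}\mathbf{w}$ with $\mathbf{w} \sim \mathcal{N}(\mathbf{0}, \mathbf{I})$ has covariance $\mathbf{L}\,\mathbb{E}[\mathbf{w}\mathbf{w}^\top]\,\mathbf{L}^\top = \mathbf{A}$, exhibiting $\mathbf{A}$ as a covariance (Gram) matrix:

```julia
using LinearAlgebra

# Build a symmetric positive definite A.
n = 4
B = randn(n, n)
A = B * B' + n * I

L = cholesky(Symmetric(A)).L      # A = L * L'
m = 200_000
W = randn(n, m)                   # columns w ~ N(0, I)
X = L * W                         # columns x = L * w have covariance A
C_hat = (X * X') / m              # sample estimate of E[x x'] = A

rel_err = opnorm(C_hat - A) / opnorm(A)
```

The relative error `rel_err` shrinks roughly like $1/\sqrt{m}$, so the sample covariance converges to the given p.s.d. matrix.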
But any demonstration of this result on a computer, whose computational model is some flavor of floating-point arithmetic rather than $\mathbb R$ arithmetic, will suffer from roundoff error at some place or other.
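For the converse direction, one can check, up to that roundoff, that every quadratic form in a sample covariance matrix equals a variance and is therefore non-negative. A small sketch (the dimensions and variable names are illustrative):

```julia
using LinearAlgebra, Statistics

m, n = 1_000, 5
X = randn(m, n)                   # m observations of an n-dimensional variable
C = cov(X)                        # n×n sample covariance of the columns

a = randn(n)
q = a' * C * a                    # quadratic form in C
# q equals the (corrected) sample variance of the projected data X * a,
# which is a sum of squares divided by m - 1, hence non-negative.
q ≈ var(X * a)
```

Repeating this for many random directions `a` gives a numerical check that $\mathbf{a}^\top \mathbf{C}\, \mathbf{a} \ge 0$ for all $\mathbf{a}$, i.e. that the covariance matrix is p.s.d.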