Suppose we have a real symmetric matrix $W \in \mathbb{R}^{n \times n}$.
Is there a set of linear conditions (linear equations or linear inequalities in the matrix elements $w_{11}, w_{12}, \dots, w_{nn}$) that is necessary and sufficient for $W$ to be positive semidefinite?
Essentially, I am looking for a theorem that is linear in the matrix elements (so something like the determinant will not work, since it is nonlinear in the elements of $W$) and that lets me check whether $W$ (real, symmetric) is a covariance matrix. Does such a condition even exist?
I found a theorem that comes very close to meeting all my requirements: diagonally dominant real symmetric matrices with nonnegative diagonal entries are always positive semidefinite. Diagonal dominance requires, for every row, that the sum of the magnitudes of the off-diagonal elements be at most the diagonal element. This is a set of $n$ inequalities (linear once the absolute values are expanded into sign cases).
But not all covariance matrices are diagonally dominant, so this condition does not cover the space of all covariance matrices.
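A concrete example of this gap, sketched in NumPy (the all-ones matrix is my own illustration, not from the question): the $3 \times 3$ all-ones matrix is the covariance matrix of $(Z, Z, Z)$ for a single unit-variance random variable $Z$, hence positive semidefinite, yet it fails diagonal dominance in every row.

```python
import numpy as np

# The 3x3 all-ones matrix: a valid covariance matrix (of (Z, Z, Z) for a
# unit-variance Z), hence PSD, but not diagonally dominant.
W = np.ones((3, 3))

# PSD check via eigenvalues (eigvalsh returns them in ascending order).
eigenvalues = np.linalg.eigvalsh(W)
is_psd = bool(np.all(eigenvalues >= -1e-12))   # PSD up to numerical tolerance

# Diagonal dominance check: off-diagonal magnitude sum vs. diagonal entry.
off_diag_sums = np.abs(W).sum(axis=1) - np.abs(np.diag(W))
is_diag_dominant = bool(np.all(off_diag_sums <= np.diag(W)))

print(eigenvalues)        # approximately [0, 0, 3]
print(is_psd)             # True
print(is_diag_dominant)   # False: each row has off-diagonal sum 2 > 1
```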
I appreciate the help!
No such finite set of linear inequalities exists, because their existence would wrongly imply that $$ \left\{(x,y)\in\mathbb R^2:\ 0\le x\le1,\ \pmatrix{x&y\\ y&1}\succeq0\right\} =\left\{(x,y):\ y^2\le x\le1\right\} $$ is a polytope. It is not: its boundary contains the parabolic arc $x=y^2$, which no finite collection of half-planes can carve out.
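The set equality above can be checked numerically, sketched here in NumPy (the grid and tolerance are my own choices): over $[0,1]\times[-1,1]$, the matrix $\pmatrix{x&y\\ y&1}$ is positive semidefinite exactly where $y^2\le x$, which traces the parabolic boundary.

```python
import numpy as np

def is_psd(x, y, tol=1e-12):
    """Is [[x, y], [y, 1]] positive semidefinite (up to tolerance)?"""
    # eigvalsh returns eigenvalues in ascending order; check the smallest.
    return np.linalg.eigvalsh(np.array([[x, y], [y, 1.0]]))[0] >= -tol

# On a grid over [0, 1] x [-1, 1], PSD holds exactly when y^2 <= x.
for x in np.linspace(0.0, 1.0, 21):
    for y in np.linspace(-1.0, 1.0, 41):
        assert is_psd(x, y) == (y * y <= x + 1e-12)
```

For a $2\times2$ symmetric matrix, PSD is equivalent to nonnegative trace and determinant, and $\det\pmatrix{x&y\\ y&1} = x - y^2 \ge 0$ recovers the parabola directly.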