What is the relation between off-diagonal entries and the determinant of a SPSD matrix?


I am interested in the differential entropy of a multivariate Gaussian distribution, $h(f)$. In particular, I want to know how $h(f)$ changes with the off-diagonal elements of the covariance matrix, $\Sigma$. As a reminder, for an $n$-dimensional Gaussian, $h(f)=\frac{1}{2}\ln\!\left((2\pi e)^n \det(\Sigma)\right)$, so $h(f)$ is an increasing function of $\det(\Sigma)$.

We know $\Sigma$ is symmetric positive semidefinite, so that $\text{det}(\Sigma)\geq 0$.

For a $2\times 2$ matrix, $ \begin{vmatrix} a&c\\ c&d \end{vmatrix}= ad-c^2 $, which implies that the smaller the covariance between the two variables is in magnitude, the larger the determinant, and hence the higher the entropy of the distribution (as we would expect: if the variables covary more strongly, the distribution is "less random").
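A quick numerical sketch of the $2\times 2$ case (the values of $a$, $d$, and $c$ below are my own illustrative choices): the determinant $ad-c^2$ depends only on $|c|$ and strictly decreases as $|c|$ grows.

```python
# det of the 2x2 covariance matrix [[a, c], [c, d]].
def det2(a, c, d):
    return a * d - c * c

# Fix the variances a = 2, d = 3 and vary the covariance c:
# the determinant falls monotonically as |c| increases.
for c in [0.0, 0.5, 1.0, 1.5]:
    print(c, det2(2.0, c, 3.0))
```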

For a $3\times 3$ matrix the conclusion is less clear: $ \begin{vmatrix} a&d&e\\ d&b&f\\ e&f&c\\ \end{vmatrix}= a(bc-f^2)-d(dc-fe)+e(df-be) = abc - af^2 - cd^2 - be^2 + 2def $. Because of the cross term $2def$, whose sign depends on the signs of all three off-diagonal entries, the determinant is no longer a monotone function of the magnitude of each individual off-diagonal entry.
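To see the effect of the cross term concretely, here is a small sketch (the specific entries are my own example, not from the question): with $a=b=c=2$ and $e=f=0.5$ fixed, the determinant as a function of $d$ is $7 + 0.5d - 2d^2$, which peaks at $d = 0.125$ rather than at $d=0$, so shrinking an off-diagonal entry toward zero does not always increase the determinant.

```python
# det of the symmetric 3x3 matrix [[a, d, e], [d, b, f], [e, f, c]],
# expanded along the first row.
def det3(a, b, c, d, e, f):
    return a * (b * c - f * f) - d * (d * c - f * e) + e * (d * f - b * e)

# With a = b = c = 2 and e = f = 0.5, the determinant is
# 7 + 0.5*d - 2*d**2: moving d from -0.5 to 0 increases it,
# while moving d from 0 to 0.5 decreases it again.
for d in [-0.5, 0.0, 0.125, 0.5]:
    print(d, det3(2, 2, 2, d, 0.5, 0.5))
```

All four matrices above are diagonally dominant with positive diagonal, hence positive definite, so the example stays inside the SPSD setting of the question.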

Can we say anything about the relation between the off-diagonal elements of a symmetric positive semidefinite matrix and its determinant?