Oppenheim's inequality is a standard result about the Hadamard product of positive definite matrices. It goes as follows: let $A=(a_{ij})_{i,j\leq n}$ and $B=(b_{ij})_{i,j\leq n} \in S_n^{++}$, where $S_n^{++}$ denotes the cone of positive definite matrices of size $n$, and let $\odot$ denote the Hadamard (entrywise) product. Then $$\det (A \odot B) \geq a_{11}\cdots a_{nn} \det (B), $$ with equality if and only if $B$ is diagonal.
Most proofs (e.g. Markham's, 1986) use induction. I thought I had found a neat probabilistic proof based on Jensen's inequality, but it does not quite work: it only yields the inequality up to a constant factor. Does anyone see a way to fix the proof?
A proof that almost works
The idea is to use the fact that, if $X \sim \mathcal{N}(0,A)$ and $Y \sim \mathcal{N}(0,B)$ are independent, then $$ A \odot B = \text{Cov}(X \odot Y) = \mathbb{E} [ (X\odot Y) (X\odot Y)^T]. $$ A useful way to see Hadamard products between vectors is to note that $X\odot Y = D_X Y$, where $D_X$ is the diagonal matrix with $X$ on the diagonal. Using this, $$ \mathbb{E} [ (X\odot Y)(X\odot Y)^T] = \mathbb{E} [ D_X Y Y^T D_X] = \mathbb{E}[ \mathbb{E}[D_X Y Y^T D_X \mid X ]] = \mathbb{E}[D_X B D_X], $$ by the law of total expectation and the independence of $X$ and $Y$.
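The covariance identity above is easy to check numerically; here is a quick Monte Carlo sanity check with NumPy (the matrix size, seed, and sample count are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two small positive definite matrices (shifted random Gram matrices).
n = 3
M1 = rng.standard_normal((n, n)); A = M1 @ M1.T + n * np.eye(n)
M2 = rng.standard_normal((n, n)); B = M2 @ M2.T + n * np.eye(n)

# Monte Carlo estimate of E[(X ⊙ Y)(X ⊙ Y)^T] with X ~ N(0, A), Y ~ N(0, B) independent.
N = 200_000
X = rng.multivariate_normal(np.zeros(n), A, size=N)
Y = rng.multivariate_normal(np.zeros(n), B, size=N)
Z = X * Y                    # row-wise Hadamard products X ⊙ Y
est = Z.T @ Z / N            # empirical second-moment matrix of X ⊙ Y

# est should be close to A ⊙ B, up to Monte Carlo error.
print(np.max(np.abs(est - A * B)))
```

Entrywise, this is just $\mathbb{E}[x_i y_i x_j y_j] = a_{ij} b_{ij}$ by independence.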
Now we can just use Jensen's inequality together with the concavity of the log-determinant on $S_n^{++}$ (note that $D_X B D_X$ is almost surely positive definite, since each $x_i \neq 0$ almost surely): \begin{align} \log \det (A \odot B) = \log \det \mathbb{E}[D_X B D_X] \geq \mathbb{E}[ \log \det (D_X B D_X )] &= \log \det B + \mathbb{E}[ \log \det(D_X^2)] \\ &= \log \det B + \sum_{i=1}^n\mathbb{E}[\log x_i^2]. \end{align} Here we would really like to use that $\mathbb{E}[x_i^2] = a_{ii}$ and conclude the proof, but Jensen's inequality goes the other way around this time :(
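The Jensen step can also be checked empirically: for the empirical distribution over samples, concavity of $\log\det$ gives $\log\det(\text{mean of matrices}) \geq \text{mean of }\log\det$ exactly. A small sketch (the choice of $A$, $B$, and seed is arbitrary), using $D_X B D_X = (XX^T) \odot B$:

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 3, 100_000

M = rng.standard_normal((n, n)); B = M @ M.T + np.eye(n)
A = np.eye(n) + 0.5 * np.ones((n, n))   # any positive definite A works

# Sample D_X B D_X for X ~ N(0, A): entrywise, (D_X B D_X)_{ij} = x_i x_j b_{ij}.
X = rng.multivariate_normal(np.zeros(n), A, size=N)
samples = (X[:, :, None] * X[:, None, :]) * B     # shape (N, n, n)

lhs = np.linalg.slogdet(samples.mean(axis=0))[1]  # log det of the average matrix
rhs = np.mean(np.linalg.slogdet(samples)[1])      # average of the log dets

print(lhs >= rhs)  # True: finite-sample Jensen for the concave log-determinant
```

As $N \to \infty$, `lhs` converges to $\log\det(A \odot B)$ and `rhs` to $\mathbb{E}[\log\det(D_X B D_X)]$.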
We have still proven something: since $X$ is Gaussian, $x_i^2/a_{ii}$ follows a $\chi^2$ distribution with a single degree of freedom for each $i$, and therefore $$\mathbb{E}[\log x_i^2] = \mathbb{E}[\log (x_i^2/a_{ii})] + \log a_{ii} = \psi(1/2)+\log 2 + \log a_{ii} = -\log 2 - \gamma + \log a_{ii}, $$ using the known log-moment of the $\chi^2$ distribution ($\psi$ is the digamma function and $\gamma$ is Euler's constant, with $\psi(1/2) = -\gamma - 2\log 2$).
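This log-moment is easy to confirm by simulation (a NumPy sketch; the sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)

# W = x^2 / a_ii follows a chi-squared distribution with 1 degree of freedom
# when x ~ N(0, a_ii); it suffices to sample x ~ N(0, 1).
N = 2_000_000
w = rng.standard_normal(N) ** 2

# E[log W] = psi(1/2) + log 2 = -gamma - log 2 ≈ -1.2704.
mc = np.mean(np.log(w))
exact = -np.euler_gamma - np.log(2)
print(mc, exact)
```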
Putting all the pieces together, we have then proven that $$\log (\det A \odot B) \geq \log \det B + \sum_{i=1}^n \log a_{ii} - n(\gamma + \log 2) \approx \log \det B + \sum_{i=1}^n \log a_{ii} - 1.27n. $$
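A quick deterministic check that the chain of bounds is ordered as claimed, i.e. $\log\det(A\odot B) \geq \log\det B + \sum_i \log a_{ii} \geq$ the weaker bound proven here (random positive definite matrices; Oppenheim guarantees the first inequality):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
M1 = rng.standard_normal((n, n)); A = M1 @ M1.T + np.eye(n)
M2 = rng.standard_normal((n, n)); B = M2 @ M2.T + np.eye(n)

lhs = np.linalg.slogdet(A * B)[1]                              # log det(A ⊙ B)
oppenheim = np.linalg.slogdet(B)[1] + np.log(np.diag(A)).sum() # log det B + Σ log a_ii
weaker = oppenheim - n * (np.euler_gamma + np.log(2))          # bound proven above

print(lhs >= oppenheim >= weaker)  # True
```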
Oppenheim's inequality is exactly this, but without the annoying additional $1.27n$ term! My question is then: is there a way to adapt this proof so that it gives the tight bound?
Reference
Markham, T. L., "Oppenheim's Inequality for Positive Definite Matrices", The American Mathematical Monthly, Vol. 93, No. 8 (1986), pp. 642–644.