Bounds on eigenvalues/trace/singular values of a matrix based on trace/Frobenius norm etc.


Background: Given an $M\times P$ matrix $B^{(R)}$ with entries $B^{(R)}_{m,p}=f_R(X_1,\dots,X_M)$, where the $X_m$ are random variables (for simplicity, i.i.d. Gaussian) and $f_R$ is a function parameterized by $R$. (Note that $B^{(R)}$ is a finite random matrix, but its entries are functions of the random variables, and since there are only $M$ of them, some entries will be dependent.) Consider the Gram matrix $A^{(R)} = (B^{(R)})^T B^{(R)}$. Numerically I clearly observe that $\operatorname{Tr}((A^{(R_1)})^{-1}) \leq \operatorname{Tr}((A^{(R_2)})^{-1})$; in words, for a given $R$, one matrix structure is better than the other. This is what I want to prove. The ordering holds not only for the trace but also for the minimum eigenvalue of $A^{(R)}$, and for the determinant and condition number of $(A^{(R)})^{-1}$.
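For concreteness, here is a minimal Monte Carlo sketch of the kind of comparison described above. Since the actual $f_R$ is not specified, the cosine feature map `feature_matrix` below is a hypothetical stand-in; only the overall shape of the experiment (Gaussian i.i.d. $X_m$, Gram matrix $A = B^T B$, averaged $\operatorname{Tr}(A^{-1})$ for two parameter values) follows the setup in the question.

```python
import numpy as np

def feature_matrix(X, P, R):
    """Hypothetical stand-in for f_R: column p of B is cos(p * x / R).
    The question's actual f_R is unspecified; this is for illustration only."""
    x = X[:, None]                      # shape (M, 1)
    p = np.arange(1, P + 1)[None, :]    # shape (1, P)
    return np.cos(p * x / R)            # shape (M, P)

def mean_trace_inverse(P, R, M=50, trials=200, seed=0):
    """Monte Carlo estimate of E[Tr((B^T B)^{-1})] with Gaussian i.i.d. X."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(trials):
        X = rng.standard_normal(M)
        B = feature_matrix(X, P, R)
        A = B.T @ B                     # P x P Gram matrix, PSD
        total += np.trace(np.linalg.inv(A))
    return total / trials

# Compare two parameter choices R1 and R2 numerically.
t1 = mean_trace_inverse(P=5, R=1.0)
t2 = mean_trace_inverse(P=5, R=2.0)
print(t1, t2)
```

Which of the two estimates is smaller depends entirely on the chosen feature map, so this only reproduces the experimental setup, not the observed ordering itself.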

My idea: Without inverting the matrix, or computing its eigenvalues or determinant, I can calculate any property that is linear in $A^{(R)}$, because I can push the expectation operator inside. So it is "easy" to obtain $\mathbb{E}\operatorname{Tr}A^{(R)}$ or $\mathbb{E}\|A^{(R)}\|_F^2$. But any nonlinear transformation makes the expected value hard to obtain. Hence my idea is to bound these properties of the inverse for given $R_1$ and $R_2$, so that I can prove that the matrix with $R_1$ is always better than the one with $R_2$.
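To illustrate pushing the expectation inside a linear functional: with the same hypothetical cosine features $B_{m,p}=\cos(p X_m/R)$ and $X_m\sim\mathcal N(0,1)$ (an assumption, not the question's actual $f_R$), each diagonal entry of $A$ has a closed-form mean via $\mathbb{E}\cos(tX)=e^{-t^2/2}$, so $\mathbb{E}\operatorname{Tr}A$ is available analytically and can be checked against simulation:

```python
import numpy as np

# Hypothetical feature map: B[m, p] = cos(p * X[m] / R), X[m] i.i.d. N(0, 1).
# Then A = B^T B and E[Tr A] = sum_{p,m} E[cos(p X_m / R)^2], where
# E[cos(t X)^2] = (1 + E[cos(2 t X)]) / 2 = (1 + exp(-2 t^2)) / 2.
def expected_trace_analytic(M, P, R):
    p = np.arange(1, P + 1)
    return M * np.sum((1.0 + np.exp(-2.0 * p**2 / R**2)) / 2.0)

def expected_trace_monte_carlo(M, P, R, trials=5000, seed=0):
    rng = np.random.default_rng(seed)
    p = np.arange(1, P + 1)
    total = 0.0
    for _ in range(trials):
        X = rng.standard_normal(M)
        B = np.cos(p[None, :] * X[:, None] / R)
        total += np.trace(B.T @ B)
    return total / trials

mc = expected_trace_monte_carlo(M=20, P=5, R=2.0)
an = expected_trace_analytic(M=20, P=5, R=2.0)
print(mc, an)   # the two values should agree closely
```

No matrix inverse appears anywhere: linearity of expectation reduces $\mathbb{E}\operatorname{Tr}A$ to entrywise moments of $B$, exactly the "easy" case described above.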

Question: I want to show that $$ \mathbb{E}g(A^{-1}) \leq \mathbb{E} f(A) , $$

where

  • $f$ is a linear function of $A$, so I can relate $\mathbb{E}f(A)$ to the expected values of the entries $A_{p,q}$ (which I know analytically)
  • $g$ is one of the following: $\operatorname{Tr}$, $\det$, $\lambda_{\max}$
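For context, the classical inequalities relating $\operatorname{Tr}(A)$ and $\operatorname{Tr}(A^{-1})$ tend to point in the opposite direction to the one requested. For symmetric positive definite $A\in\mathbb{R}^{P\times P}$ with eigenvalues $\lambda_i>0$, Cauchy–Schwarz gives

```latex
P^2
= \Bigl(\sum_{i=1}^P \sqrt{\lambda_i}\cdot\frac{1}{\sqrt{\lambda_i}}\Bigr)^{\!2}
\le \Bigl(\sum_{i=1}^P \lambda_i\Bigr)\Bigl(\sum_{i=1}^P \frac{1}{\lambda_i}\Bigr)
= \operatorname{Tr}(A)\,\operatorname{Tr}(A^{-1}),
\qquad\text{so}\qquad
\operatorname{Tr}(A^{-1}) \ge \frac{P^2}{\operatorname{Tr}(A)}.
```

This is a *lower* bound on $\operatorname{Tr}(A^{-1})$ in terms of a linear functional of $A$, which already hints at why an upper bound of the requested form is delicate.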

Does something like this exist? Note that a valid answer would also be "No, such a bound does not exist" (preferably with an explanation of why).