A bound for the expectation of a quadratic form of a normal random vector


Let $X\sim\mathcal{N}(m, R)$ be an $n$-dimensional multivariate Gaussian random vector and $A\in\mathbb{R}^{n\times n}$ a (deterministic) symmetric positive definite matrix. Assume that the diagonal elements of $R$ are of the form $r_{ii} = \eta^2 m_i^2$ (that is, the relative variance $r_{ii}/m_i^2$ is the same for each element). Does the bound $$\mathbf{E}[X^TAX] \geq (1+\eta^2) m^TAm$$ hold in general? This is trivially true when $A$ is a diagonal matrix, but I am not able to generalize it. I have figured out that $$\mathbf{E}[X^TAX] = m^TAm + \sum_{i,j} a_{ij} r_{ij} = m^TAm + \mathbf{trace}(AR),$$ but I can't get anywhere from there. If this does not hold in general, I'd be happy to make some assumptions on $A$ and/or $R$. One such assumption would be to extend the assumption on the diagonal elements of $R$ to the off-diagonal elements, in the sense that $r_{ij} = \eta^2 m_i m_j$, but that is not justified in my case. Does anybody have any other ideas?
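For what it's worth, the identity $\mathbf{E}[X^TAX] = m^TAm + \mathbf{trace}(AR)$ is easy to sanity-check numerically. Here is a small Monte Carlo sketch in numpy; the particular $m$, $A$, and the way $R$ is built (a random correlation matrix rescaled so that $r_{ii} = \eta^2 m_i^2$) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

n, eta = 3, 0.5
m = np.array([1.0, -2.0, 0.5])  # mean vector (nonzero entries, arbitrary)

# A symmetric positive definite A (arbitrary construction).
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)

# A covariance R with diagonal r_ii = eta^2 * m_i^2:
# start from a random PSD matrix, normalize to unit diagonal,
# then rescale by the desired standard deviations.
C = rng.standard_normal((n, n))
C = C @ C.T
d = np.sqrt(np.diag(C))
corr = C / np.outer(d, d)          # unit diagonal, still PSD
s = eta * np.abs(m)                # std devs giving r_ii = eta^2 m_i^2
R = corr * np.outer(s, s)

# Exact value from the identity E[X^T A X] = m^T A m + trace(A R).
exact = m @ A @ m + np.trace(A @ R)

# Monte Carlo estimate of E[X^T A X].
X = rng.multivariate_normal(m, R, size=200_000)
mc = np.mean(np.einsum('ij,jk,ik->i', X, A, X))

print(exact, mc)  # the two values should agree to within sampling error
```

The same script can also be used to probe the conjectured bound by comparing `exact` against `(1 + eta**2) * (m @ A @ m)` for many random draws of $A$ and $R$.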

Edit: I'd also be happy with a bound $$\mathbf{E}[X^TAX] \geq c (1+\eta^2) m^TAm$$ for some constant $c>0$. One such bound I was able to derive is $c=\lambda_\text{min}/\lambda_\text{max}$, where $\lambda_\text{min}$ and $\lambda_\text{max}$ are the smallest and largest eigenvalues of $A$, but maybe we can do better?
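(For completeness, here is a sketch of how a constant of this form can be obtained, using $\mathbf{trace}(AR) = \mathbf{trace}(R^{1/2}AR^{1/2})$ and standard eigenvalue bounds: $$\mathbf{trace}(AR) = \mathbf{trace}(R^{1/2}AR^{1/2}) \geq \lambda_\text{min}\,\mathbf{trace}(R) = \lambda_\text{min}\,\eta^2\|m\|^2 \geq \frac{\lambda_\text{min}}{\lambda_\text{max}}\,\eta^2\, m^TAm,$$ where the last step uses $m^TAm \leq \lambda_\text{max}\|m\|^2$. Since $\lambda_\text{min}/\lambda_\text{max} \leq 1$, this gives $$\mathbf{E}[X^TAX] = m^TAm + \mathbf{trace}(AR) \geq \frac{\lambda_\text{min}}{\lambda_\text{max}}(1+\eta^2)\, m^TAm.)$$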