Approximations or bounds for $\mathbb{E}\left[ \|X\|^{-2} \right]$ and $\mathbb{E}\left[\|X\|^{-1} \right]$, $X_i \sim\mathcal{N}(\mu_i, \sigma_i^2)$


Let $X\sim\mathcal{N}(\mu, \Sigma)$. There is considerable literature on the distribution and moments of quadratic forms $Q=X'AX$, where $A$ is a symmetric (or symmetric positive definite) matrix. These distributions and moments can be quite complicated; see in particular Mathai & Provost, *Quadratic Forms in Random Variables*.

Some special cases of these quadratic forms are more tractable. For example, when $A=\Sigma=I$, $Q=\|X\|^2$ has a noncentral chi-squared distribution, and there are analytic formulas for the positive, negative, and fractional moments of this variable (in particular, for $\mathbb{E}\left[\frac{1}{\|X\|} \right]$ and $\mathbb{E}\left[\frac{1}{\|X\|^2} \right]$).
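For concreteness, here is a small sketch of what I mean by the tractable case. Using the Poisson-mixture representation of the noncentral chi-squared distribution (conditional on a Poisson $J=j$, $Q\sim\chi^2_{k+2j}$), one gets $\mathbb{E}[Q^{-s}] = 2^{-s} e^{-\lambda/2}\sum_{j\ge 0}\frac{(\lambda/2)^j}{j!}\frac{\Gamma(k/2+j-s)}{\Gamma(k/2+j)}$ for $k/2>s$. The code below (my own numerical check, example values of $\mu$ are arbitrary) compares this series against Monte Carlo:

```python
import numpy as np
from scipy.special import gammaln

def neg_moment(k, lam, s, terms=200):
    """E[Q^{-s}] for Q ~ noncentral chi-squared with k dof and
    noncentrality lam, via the Poisson-mixture series (needs k/2 > s)."""
    j = np.arange(terms)
    # log Poisson(lam/2) weights
    log_poisson = -lam / 2 + j * np.log(lam / 2) - gammaln(j + 1)
    # E[(chi^2_{k+2j})^{-s}] = 2^{-s} Gamma(k/2 + j - s) / Gamma(k/2 + j)
    log_moment = -s * np.log(2) + gammaln(k / 2 + j - s) - gammaln(k / 2 + j)
    return float(np.exp(log_poisson + log_moment).sum())

# Monte Carlo check with Sigma = I, so ||X||^2 is noncentral chi-squared
rng = np.random.default_rng(0)
mu = np.array([1.0, -0.5, 0.3, 0.2, 0.1])   # arbitrary example mean
X = rng.normal(mu, 1.0, size=(10**6, mu.size))
Q = (X ** 2).sum(axis=1)
lam = (mu ** 2).sum()
mc_inv_sq = (1 / Q).mean()                   # E[1/||X||^2], Monte Carlo
mc_inv = (1 / np.sqrt(Q)).mean()             # E[1/||X||],   Monte Carlo
print(mc_inv_sq, neg_moment(mu.size, lam, 1.0))
print(mc_inv, neg_moment(mu.size, lam, 0.5))
```

The two estimates agree to Monte Carlo accuracy, which is what I would hope a good approximation for the diagonal-$\Sigma$ case could be checked against.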

I am interested in obtaining $\mathbb{E}\left[\frac{1}{\|X\|} \right]$ and $\mathbb{E}\left[\frac{1}{\|X\|^2} \right]$ for the case where $\Sigma$ is diagonal, i.e. $\Sigma_{ii} = \sigma_i^2$. From the extensive literature on the subject (e.g. this, this), it seems that there is no simple expression for these moments. However, given that the case I'm considering ($A=I$ and $\Sigma$ diagonal) is simpler than the general case typically treated in the literature, I wonder whether there is some way to analytically approximate or bound these negative moments of the norm of a Gaussian vector with independent components $X_i$. In this 1988 paper, for example, there is an expression given in terms of "a convergent doubly infinite series involving an invariant polynomial" for the general case... can that formula be simplified by an approximation, or for this simpler case?
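In the meantime, the only bound I can see is the trivial Jensen lower bound $\mathbb{E}[1/\|X\|^2] \ge 1/\mathbb{E}[\|X\|^2] = 1/\sum_i(\mu_i^2+\sigma_i^2)$ (and similarly $\mathbb{E}[1/\|X\|] \ge 1/\sqrt{\sum_i(\mu_i^2+\sigma_i^2)}$). A quick Monte Carlo sketch of the diagonal case against these bounds (example values of $\mu$ and $\sigma$ are arbitrary):

```python
import numpy as np

# Monte Carlo estimates for the diagonal-Sigma case: independent
# X_i ~ N(mu_i, sigma_i^2), compared with the Jensen lower bounds.
rng = np.random.default_rng(1)
mu = np.array([1.0, -0.5, 0.3, 0.2, 0.1])    # arbitrary example means
sigma = np.array([0.5, 1.0, 1.5, 0.8, 1.2])  # arbitrary example std devs

X = rng.normal(mu, sigma, size=(10**6, mu.size))
sq = (X ** 2).sum(axis=1)                    # ||X||^2 samples
inv_sq = (1 / sq).mean()                     # E[1/||X||^2], Monte Carlo
inv_norm = (1 / np.sqrt(sq)).mean()          # E[1/||X||],   Monte Carlo

second_moment = (mu ** 2 + sigma ** 2).sum() # E[||X||^2], exact
print(inv_sq, ">=", 1 / second_moment)       # Jensen lower bound
print(inv_norm, ">=", 1 / np.sqrt(second_moment))
```

The gap between the Monte Carlo values and these lower bounds is substantial, so something sharper would be very welcome.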

For some context, this question comes from attempts to approximate other formulas involving quadratic forms of Gaussian vectors (see question here).