Ratios of Gaussian integrals with a positive semidefinite matrix


Generally speaking, I’m wondering what the usual identities for Gaussian multiple integrals with a positive definite matrix become when the matrix is only positive semidefinite. I could not find anything about this in the literature; any reference is welcome.

For instance, if ${\mathbf{A}}$ is positive definite, then we have

$\mathbb{E}{x_i} = \frac{{\int\limits_{\,{\mathbf{x}}} {{x_i}{e^{ - \frac{1}{2}{{\mathbf{x}}^{\mathbf{T}}}{\mathbf{Ax}} + {{\mathbf{J}}^{\mathbf{T}}}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }}{{\int\limits_{\,{\mathbf{x}}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^{\mathbf{T}}}{\mathbf{Ax}} + {{\mathbf{J}}^{\mathbf{T}}}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }} = {\left( {{{\mathbf{A}}^{ - 1}}{\mathbf{J}}} \right)_i}$
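As a sanity check of this identity, here is a brute-force numerical sketch in $n = 2$ (the particular ${\mathbf{A}}$ and ${\mathbf{J}}$ are arbitrary illustrative choices): the ratio of grid sums approximates the ratio of integrals and matches ${\mathbf{A}}^{-1}{\mathbf{J}}$.

```python
import numpy as np

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])   # positive definite (illustrative choice)
J = np.array([1.0, -1.0])    # illustrative source vector

# Grid over a box large enough that the Gaussian tail is negligible.
t = np.linspace(-12, 12, 601)
X, Y = np.meshgrid(t, t, indexing="ij")
pts = np.stack([X, Y], axis=-1)               # shape (601, 601, 2)

quad = np.einsum("...i,ij,...j->...", pts, A, pts)   # x^T A x at each node
w = np.exp(-0.5 * quad + pts @ J)                    # unnormalized weight

# The grid volume element cancels in the ratio of sums.
mean = np.array([np.sum(X * w), np.sum(Y * w)]) / np.sum(w)

print(mean)                    # numerically integrated E[x]
print(np.linalg.solve(A, J))   # closed form A^{-1} J
```

The common volume factor ${\text{d}}x^2$ cancels in the ratio, so plain grid sums suffice.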

If ${\mathbf{A}}$ is only positive semidefinite, do we have

$\mathbb{E}{x_i} = \frac{{\int\limits_{\,{\mathbf{x}}} {{x_i}{e^{ - \frac{1}{2}{{\mathbf{x}}^{\mathbf{T}}}{\mathbf{Ax}} + {{\mathbf{J}}^{\mathbf{T}}}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }}{{\int\limits_{\,{\mathbf{x}}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^{\mathbf{T}}}{\mathbf{Ax}} + {{\mathbf{J}}^{\mathbf{T}}}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }} = {\left( {{{\mathbf{A}}^ + }{\mathbf{J}}} \right)_i}$

where ${{\mathbf{A}}^ + }$ is the Moore-Penrose pseudo-inverse of ${\mathbf{A}}$ ?
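One way to probe this numerically (a sketch, not a proof; the singular ${\mathbf{A}}$ and the vector ${\mathbf{J}}$ below are hypothetical examples) is to regularize ${\mathbf{A}}$ to ${\mathbf{A}} + \varepsilon {\mathbf{I}}$, which makes both integrals finite, apply the positive definite identity, and let $\varepsilon \to 0$. When ${\mathbf{J}}$ lies in the range of ${\mathbf{A}}$, the regularized mean $({\mathbf{A}} + \varepsilon {\mathbf{I}})^{-1}{\mathbf{J}}$ does converge to ${{\mathbf{A}}^ + }{\mathbf{J}}$:

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [0.0, 0.0]])
A = B.T @ B                       # [[1, 2], [2, 4]]: rank 1, psd
Apinv = np.linalg.pinv(A)

J_in = A @ np.array([3.0, -1.0])  # guarantees J_in is in the range of A

for eps in [1e-2, 1e-4, 1e-6]:
    reg = np.linalg.solve(A + eps * np.eye(2), J_in)
    print(eps, reg)               # approaches A^+ J_in as eps -> 0

print(Apinv @ J_in)               # the conjectured limit
```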

P.S. Of course, as pointed out by Gyu Eun below, both integrals are infinite when the matrix is only positive semidefinite. But this does not imply that the ratio itself is infinite. The situation looks similar to Feynman path integrals in QM and QFT: we can only talk about ratios of path integrals, since each path integral by itself is infinite (being infinite-dimensional), yet the ratio is finite; otherwise path integrals would not exist. Hence my question: do we have the same kind of infinity-cancellation phenomenon for ratios of finite-dimensional Gaussian integrals with a psd matrix as for, e.g., infinite-dimensional Gaussian path integrals with a non-singular operator?

The second formula, with the pseudo-inverse, appears to hold in practice: when used in applications it gives meaningful and useful results, and everything works fine. It is possible that the formula holds only in special cases, including mine. But my own ${\mathbf{A}}$ and ${\mathbf{J}}$ are fairly generic, so the formula is likely to hold without extra conditions. That said, proving it under suitable conditions would be great too.


First answer (Gyu Eun):

If $\mathbf{A}$ is only positive semidefinite, then the corresponding Gaussian integrals are not necessarily defined. In one dimension the only positive semidefinite matrix that is not positive definite is the zero matrix, for which $x^t\mathbf{A}x = 0$ for all $x\in\mathbb{R}$, so $$ \int_{-\infty}^\infty e^{-x^t\mathbf{A}x}~dx = \int_{-\infty}^\infty 1~dx = \infty. $$ The same happens in higher dimensions for the same reason. For instance, the Gaussian integral with the matrix $$ \mathbf{A} = \begin{bmatrix} 1 & 0\\ 0 & 0 \end{bmatrix} $$ is infinite: reducing to an iterated integral, the integration over the second coordinate is exactly the one-dimensional integral above.
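The divergence is easy to see numerically as well (a sketch using the singular matrix above and this answer's normalization $e^{-x^t\mathbf{A}x}$): the weight is constant along the null direction, so the integral truncated to $[-R,R]^2$ grows linearly in $R$ instead of converging.

```python
import numpy as np

def box_integral(R, n=2001):
    # With A = [[1, 0], [0, 0]], exp(-x^t A x) = exp(-x1^2),
    # so the double integral over [-R, R]^2 factorizes.
    t = np.linspace(-R, R, n)
    dx = t[1] - t[0]
    gauss = np.sum(np.exp(-t**2)) * dx   # ~ sqrt(pi), converges in R
    flat = np.sum(np.ones_like(t)) * dx  # ~ 2R, diverges as R grows
    return gauss * flat

for R in [10, 20, 40]:
    print(R, box_integral(R))  # roughly doubles each time R doubles
```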

Second answer (by the question's author):

Here is a very partial answer.

Theorem: if ${\mathbf{A}}$ is positive semidefinite and ${\mathbf{J}} = {\mathbf{A}}{{\mathbf{A}}^ + }{\mathbf{J}}$, then $\mathbb{E}{\mathbf{x}} = {{\mathbf{A}}^ + }{\mathbf{J}}$, provided we allow ourselves to cancel terms like $\frac{a}{a}$ even when $a$ is infinite.

Proof: recall one of the usual proofs of the identity

$\int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^{\mathbf{T}}}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} = \frac{{{{\left( {2\pi } \right)}^{\frac{n}{2}}}}}{{\sqrt {\left| {\mathbf{A}} \right|} }}{e^{\frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}}}}$

for a positive definite matrix ${\mathbf{A}}$ .

Substitute ${\mathbf{y}} = {\mathbf{x}} - {{\mathbf{A}}^{ - 1}}{\mathbf{J}}$ for ${\mathbf{x}}$, so that

${{\text{d}}^n}{\mathbf{x}} = {{\text{d}}^n}{\mathbf{y}}$

$ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}} = - \frac{1}{2}{\left( {{\mathbf{y}} + {{\mathbf{A}}^{ - 1}}{\mathbf{J}}} \right)^T}{\mathbf{A}}\left( {{\mathbf{y}} + {{\mathbf{A}}^{ - 1}}{\mathbf{J}}} \right) + {{\mathbf{J}}^T}\left( {{\mathbf{y}} + {{\mathbf{A}}^{ - 1}}{\mathbf{J}}} \right) = \\ - \frac{1}{2}\left( {{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}} + {{\mathbf{J}}^T}{\mathbf{y}} + {{\mathbf{y}}^T}{\mathbf{J}} + {{\mathbf{y}}^T}{\mathbf{Ay}}} \right) + {{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}} + {{\mathbf{J}}^T}{\mathbf{y}} = \\ \frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}} - \frac{1}{2}{{\mathbf{y}}^T}{\mathbf{Ay}} \\ $

Therefore

$\int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} = {e^{\frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}}}}\int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{y}}^T}{\mathbf{Ay}}}}{\text{d}}{\mathbf{y}}} = \frac{{{{\left( {2\pi } \right)}^{\frac{n}{2}}}}}{{\sqrt {\left| {\mathbf{A}} \right|} }}{e^{\frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}}}}$
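For concreteness, this closed form can be checked by brute-force grid integration in $n = 2$ (the ${\mathbf{A}}$ and ${\mathbf{J}}$ below are illustrative choices):

```python
import numpy as np

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])   # positive definite (illustrative choice)
J = np.array([1.0, -1.0])

t = np.linspace(-12, 12, 801)
dx = t[1] - t[0]
X, Y = np.meshgrid(t, t, indexing="ij")
pts = np.stack([X, Y], axis=-1)
quad = np.einsum("...i,ij,...j->...", pts, A, pts)
numeric = np.sum(np.exp(-0.5 * quad + pts @ J)) * dx * dx

# (2*pi)^{n/2} / sqrt(det A) * exp(J^T A^{-1} J / 2) with n = 2
closed = 2 * np.pi / np.sqrt(np.linalg.det(A)) \
         * np.exp(0.5 * J @ np.linalg.solve(A, J))
print(numeric, closed)   # the two values agree closely
```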

Then, by the Leibniz rule/Feynman trick

$ \frac{{\partial \int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }}{{\partial {{\mathbf{J}}_i}}} = \int\limits_{{\mathbb{R}^n}} {\frac{{\partial {e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}}}{{\partial {{\mathbf{J}}_i}}}{\text{d}}{\mathbf{x}}} = \int\limits_{{\mathbb{R}^n}} {{x_i}{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} = \\ \frac{{{{\left( {2\pi } \right)}^{\frac{n}{2}}}}}{{\sqrt {\left| {\mathbf{A}} \right|} }}\frac{{\partial {e^{\frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}}}}}}{{\partial {{\mathbf{J}}_i}}} = \frac{1}{2}\int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} \frac{\partial }{{\partial {{\mathbf{J}}_i}}}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}} \\ $

Hence

$\mathbb{E}{x_i} = \frac{{\int\limits_{\,{\mathbf{x}}} {{x_i}{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }}{{\int\limits_{\,{\mathbf{x}}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} }} = \frac{1}{2}\frac{\partial }{{\partial {{\mathbf{J}}_i}}}{{\mathbf{J}}^T}{{\mathbf{A}}^{ - 1}}{\mathbf{J}} = {\left( {{{\mathbf{A}}^{ - 1}}{\mathbf{J}}} \right)_i}$

Now, for a positive semidefinite matrix ${\mathbf{A}}$, substitute ${\mathbf{y}} = {\mathbf{x}} - {{\mathbf{A}}^ + }{\mathbf{J}}$ for ${\mathbf{x}}$:

$ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}} = - \frac{1}{2}{\left( {{\mathbf{y}} + {{\mathbf{A}}^ + }{\mathbf{J}}} \right)^T}{\mathbf{A}}\left( {{\mathbf{y}} + {{\mathbf{A}}^ + }{\mathbf{J}}} \right) + {{\mathbf{J}}^T}\left( {{\mathbf{y}} + {{\mathbf{A}}^ + }{\mathbf{J}}} \right) = \\ - \frac{1}{2}\left( {{{\mathbf{J}}^T}\underbrace {{{\mathbf{A}}^ + }{\mathbf{A}}{{\mathbf{A}}^ + }}_{{{\mathbf{A}}^ + }}{\mathbf{J}} + {{\mathbf{J}}^T}{{\mathbf{A}}^ + }{\mathbf{Ay}} + {{\mathbf{y}}^T}{\mathbf{A}}{{\mathbf{A}}^ + }{\mathbf{J}} + {{\mathbf{y}}^T}{\mathbf{Ay}}} \right) + {{\mathbf{J}}^T}{{\mathbf{A}}^ + }{\mathbf{J}} + {{\mathbf{J}}^T}{\mathbf{y}} = \\ \frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^ + }{\mathbf{J}} - \frac{1}{2}{{\mathbf{y}}^T}{\mathbf{Ay}} + {{\mathbf{J}}^T}\left( {{\mathbf{I}} - {{\mathbf{A}}^ + }{\mathbf{A}}} \right){\mathbf{y}} \\ $

The integral

$\int\limits_{\,{\mathbf{y}}} {{e^{ - \frac{1}{2}{{\mathbf{y}}^T}{\mathbf{Ay}}}}{\text{d}}{\mathbf{y}}} $

is now infinite. But this should not be a problem, because it cancels between the numerator and the denominator in the Leibniz rule/Feynman trick above (please correct me if I am wrong).

The term ${{\mathbf{J}}^T}\left( {{\mathbf{I}} - {{\mathbf{A}}^ + }{\mathbf{A}}} \right){\mathbf{y}}$, where ${\mathbf{I}} - {{\mathbf{A}}^ + }{\mathbf{A}}$ is the orthogonal projector onto $\ker {\mathbf{A}}$, is therefore the main obstruction to the generalized formula.

So, if ${{\mathbf{J}}^T}\left( {{\mathbf{I}} - {{\mathbf{A}}^ + }{\mathbf{A}}} \right) = 0$, which is equivalent to ${\mathbf{J}} = {\mathbf{A}}{{\mathbf{A}}^ + }{\mathbf{J}}$, i.e. to ${\mathbf{J}}$ lying in the range of ${\mathbf{A}}$, then

$\int\limits_{{\mathbb{R}^n}} {{e^{ - \frac{1}{2}{{\mathbf{x}}^T}{\mathbf{Ax}} + {{\mathbf{J}}^T}{\mathbf{x}}}}{\text{d}}{\mathbf{x}}} \propto {e^{\frac{1}{2}{{\mathbf{J}}^T}{{\mathbf{A}}^ + }{\mathbf{J}}}}$

and the generalized formula

$\mathbb{E}{\mathbf{x}} = {{\mathbf{A}}^ + }{\mathbf{J}}$

follows by the Leibniz rule/Feynman trick.
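As a numerical sketch of the proportionality above (with a hypothetical rank-one ${\mathbf{A}}$ and a ${\mathbf{J}}$ in its range): truncate both $\int e^{-\frac{1}{2}{\mathbf{x}}^T{\mathbf{A}}{\mathbf{x}} + {\mathbf{J}}^T{\mathbf{x}}}\,{\text{d}}{\mathbf{x}}$ and the source-free integral to the box $[-R,R]^2$. Each grows without bound as $R \to \infty$, but their ratio appears to settle near $e^{\frac{1}{2}{\mathbf{J}}^T{{\mathbf{A}}^ + }{\mathbf{J}}}$, which is the cancellation the argument relies on:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # rank 1, psd (hypothetical example)
J = np.array([1.0, 2.0])          # lies in range(A)
target = np.exp(0.5 * J @ np.linalg.pinv(A) @ J)

def Z(R, with_source, n=1001):
    # Truncated partition function over the box [-R, R]^2; the common
    # grid volume factor cancels in the ratio Z(R, True) / Z(R, False).
    t = np.linspace(-R, R, n)
    X, Y = np.meshgrid(t, t, indexing="ij")
    quad = X*X + 4*X*Y + 4*Y*Y    # x^T A x = (x1 + 2*x2)^2 for this A
    expo = -0.5 * quad + (X + 2*Y if with_source else 0.0)
    return np.sum(np.exp(expo))

for R in [5, 10, 20]:
    print(R, Z(R, True) / Z(R, False), target)  # ratio stays near target
```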

Perhaps this condition is fulfilled by my own ${\mathbf{A}}$ and ${\mathbf{J}}$; I need to check.
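For what it's worth, checking the condition numerically amounts to testing whether ${\mathbf{J}} = {\mathbf{A}}{{\mathbf{A}}^ + }{\mathbf{J}}$, i.e. whether ${\mathbf{J}}$ lies in the range of ${\mathbf{A}}$; a small sketch (the matrices are hypothetical examples):

```python
import numpy as np

def in_range(A, J, tol=1e-10):
    """True iff J = A A^+ J up to tol, i.e. J lies in the range of A."""
    return np.allclose(A @ (np.linalg.pinv(A) @ J), J, atol=tol)

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                   # singular psd example
print(in_range(A, np.array([1.0, 2.0])))     # in range(A)        -> True
print(in_range(A, np.array([1.0, 0.0])))     # kernel component   -> False
```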