The Gaussian Integral


Hi, I am trying to calculate the expected value $$ \mathbb{E}\big[x_i x_j\cdots x_N\big]=\int_{-\infty}^\infty x_ix_jx_k\cdots x_N \exp\bigg({-\sum_{i,j=1}^N\frac{1}{2}x_i A_{ij}x_j}-\sum_{i=1}^Nh_i x_i\bigg)\prod_{i=1}^Ndx_i; $$ note these are higher-order correlation functions for a Gaussian generating functional. The matrix $A$ is real symmetric ($A_{ij}=A_{ji}$) and positive definite, so its eigenvalues all satisfy $\lambda_i>0$. The generating functional is given by $$ \mathcal{F}(h)=\int_{-\infty}^\infty \exp\bigg({-\sum_{i,j=1}^N\frac{1}{2}x_i A_{ij}x_j}-\sum_{i=1}^Nh_i x_i\bigg)\prod_{i=1}^Ndx_i=\frac{(2\pi)^{N/2}}{\sqrt{\det A}}\exp\bigg( \frac{1}{2}\sum_{i,j=1}^N h_i A^{-1}_{ij}h_j\bigg), $$ where I used $\det(A)=\prod_{i=1}^N \lambda_i$. We calculate this by finding the minimum of the quadratic form in the exponent: $$ \frac{\partial}{\partial x_k}\bigg(\sum_{i,j=1}^{N} \frac{1}{2} x_i A_{ij} x_j+\sum_{i=1}^N h_i x_i \bigg)=\sum_{j=1}^{N} A_{kj}x_j+ h_k=0. $$ To solve this we introduce the inverse matrix $A^{-1}$, so the solution can be written as $$ x_i=-\sum_{j=1}^{N} A^{-1}_{ij} h_j. $$ We now make the change of variables $x_i \mapsto y_i$, $$ x_i=-\sum_{j=1}^{N} A^{-1}_{ij}h_j+y_i. $$ Rewriting $\mathcal{F}(h)$ we obtain $$ \mathcal{F}(h)=\exp\bigg(\sum_{i,j=1}^{N} \frac{1}{2} h_i A^{-1}_{ij} h_j\bigg)\int_{-\infty}^\infty d^Ny \exp\bigg(-\sum_{i,j=1}^{N} \frac{1}{2} y_i A_{ij}y_j \bigg). $$ This integral is now a simple Gaussian, which we diagonalize by an orthogonal transformation $A=O\,\mathrm{diag}(\lambda_1,\dots,\lambda_N)\,O^\top$ together with the linear change of variables $y=Oz$. The Jacobian of this transformation is unity, since a rotation leaves the volume invariant. The general result is \begin{equation} \mathcal{F}(h)=\big({2\pi}\big)^{N/2} (\det A)^{-1/2} \exp\bigg(\sum_{i,j=1}^{N} \frac{1}{2} h_i A^{-1}_{ij} h_j\bigg). 
\end{equation} Having calculated this, I now need to calculate the expected values of the higher-order moments, which is my question.
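Before tackling the multivariate case, the one-dimensional analogue of this formula is easy to sanity-check numerically. The sketch below (helper names are mine, not from the post) compares a brute-force Riemann sum of $\int_{-\infty}^\infty e^{-ax^2/2-hx}\,dx$ against the completed-square result $\sqrt{2\pi/a}\,e^{h^2/(2a)}$:

```python
import math

def gen_functional_numeric(a, h, lo=-30.0, hi=30.0, n=120_000):
    """Approximate the integral of exp(-a x^2/2 - h x) over the real line
    by a Riemann sum; for a rapidly decaying smooth integrand this is
    effectively the (very accurate) trapezoidal rule."""
    dx = (hi - lo) / n
    return sum(math.exp(-0.5 * a * (lo + k * dx) ** 2 - h * (lo + k * dx))
               for k in range(n)) * dx

def gen_functional_exact(a, h):
    """Closed form obtained by completing the square in the exponent."""
    return math.sqrt(2 * math.pi / a) * math.exp(h * h / (2 * a))

# The two agree to high precision for any a > 0 and real h.
a, h = 1.7, 0.4
assert abs(gen_functional_numeric(a, h) - gen_functional_exact(a, h)) < 1e-6
```

The same comparison works for any positive-definite $a$; the $N$-dimensional statement is just this computation repeated along each eigendirection of $A$.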

Note that in one dimension, the expected value I am trying to calculate is similar to $$ \int_{-\infty}^\infty x^{n} e^{-x^2/2-\alpha x}dx,\quad n\in\mathbb{Z}_{\geq 0},\ \alpha\in \mathbb{R}. $$ I have found the lower-order expected values $$ \big<x_i\big>=-A^{-1}_{ij}h_j ,\quad \big<x_i x_j\big>=A^{-1}_{ik} h_k A^{-1}_{jl}h_l+A^{-1}_{ij} $$ (the minus sign in the first moment comes from the $-\sum_i h_i x_i$ convention in the exponent), but am trying to generalize to higher orders.
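These two moments can be checked directly against the closed form of $\mathcal{F}(h)$. A small numerical sketch (my own, with the source entering as $-\sum_i h_i x_i$ as in the question's exponent): since $\partial_{h_i}$ acting under the integral pulls down $-x_i$, the normalized moments are $\langle x_i\rangle = -\partial_{h_i}\mathcal F/\mathcal F = -(A^{-1}h)_i$ and $\langle x_i x_j\rangle = \partial_{h_i}\partial_{h_j}\mathcal F/\mathcal F = (A^{-1}h)_i(A^{-1}h)_j + A^{-1}_{ij}$. Here the derivatives are taken by central finite differences on a concrete $2\times 2$ example:

```python
import math

A = [[2.0, 1.0], [1.0, 3.0]]               # symmetric positive definite (my example)
detA = A[0][0] * A[1][1] - A[0][1] * A[1][0]
B = [[A[1][1] / detA, -A[0][1] / detA],    # B = A^{-1} via the 2x2 cofactor formula
     [-A[1][0] / detA, A[0][0] / detA]]

def F(h):
    # exp(h^T B h / 2); the constant prefactor cancels in every ratio below
    q = sum(h[i] * B[i][j] * h[j] for i in range(2) for j in range(2))
    return math.exp(0.5 * q)

def shifted(h, i, d):
    g = list(h)
    g[i] += d
    return g

EPS = 1e-3
h = [0.3, -0.7]
Bh = [sum(B[i][j] * h[j] for j in range(2)) for i in range(2)]

# <x_i> = -dF/dh_i / F  =  -(A^{-1} h)_i
for i in range(2):
    fd = -(F(shifted(h, i, EPS)) - F(shifted(h, i, -EPS))) / (2 * EPS * F(h))
    assert abs(fd - (-Bh[i])) < 1e-4

# <x_i x_j> = d^2F/dh_i dh_j / F  =  (A^{-1}h)_i (A^{-1}h)_j + A^{-1}_{ij}
for i in range(2):
    for j in range(2):
        fd = (F(shifted(shifted(h, i, EPS), j, EPS))
              - F(shifted(shifted(h, i, EPS), j, -EPS))
              - F(shifted(shifted(h, i, -EPS), j, EPS))
              + F(shifted(shifted(h, i, -EPS), j, -EPS))) / (4 * EPS**2 * F(h))
        assert abs(fd - (Bh[i] * Bh[j] + B[i][j])) < 1e-4
```

This is only a sanity check of the two stated formulas, but the same differentiate-the-generating-functional scheme is what the answers below generalize to arbitrary order.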


There are 2 best solutions below


You can look up a result of Isserlis, which he proved in "On a formula for the product-moment coefficient of any order of a normal frequency distribution in any number of variables", Biometrika 12 (1918): 134–139.

Essentially, if $X_1,\ldots,X_n$ follow a zero-mean multivariate Gaussian distribution, then $E[X_1\cdots X_n] = \sum_{\text{pairings}} \prod_{(X_i,X_j)\text{ is a pair}}E[X_i X_j]$ when $n$ is even, where the sum runs over all partitions of $\{X_1,\ldots,X_n\}$ into pairs (and the moment is $0$ when $n$ is odd).
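The pairing sum transcribes almost literally into code. A minimal sketch (function names are mine; `cov[i][j]` holds the second moments $E[X_iX_j]$), enumerating the partitions into pairs recursively:

```python
def pairings(idx):
    """Yield all ways to split the tuple idx into unordered pairs."""
    if not idx:
        yield []
        return
    first, rest = idx[0], idx[1:]
    for k in range(len(rest)):
        # pair `first` with each remaining index, then recurse on the rest
        for tail in pairings(rest[:k] + rest[k + 1:]):
            yield [(first, rest[k])] + tail

def isserlis(indices, cov):
    """E[X_{i1} ... X_{im}] for zero-mean joint Gaussians with covariance cov."""
    if len(indices) % 2:
        return 0.0                      # odd moments vanish
    total = 0.0
    for pairing in pairings(tuple(indices)):
        term = 1.0
        for i, j in pairing:
            term *= cov[i][j]
        total += term
    return total

# One variable with variance 2: E[x^4] = 3*sigma^4 = 12 (three pairings)
assert isserlis((0, 0, 0, 0), [[2.0]]) == 12.0
```

For $n$ indices there are $(n-1)!! = (n-1)(n-3)\cdots 1$ pairings, which is exactly how many terms the sum produces.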


This problem is not really that hard, but it requires complicated notation, and as such it takes time to write up in LaTeX. First let us fix the indices: we take $1\le m \le N$ and $1 \le i_1 < i_2 < \cdots < i_m \le N$. We denote $B := A^{-1}$ and ${\mathcal C}_N := \sqrt{(2\pi)^N/\det(A)}$, and we are interested in the expectation of the product $X_{i_1} X_{i_2} \cdots X_{i_m}$ as defined in the body of the question. Using the generating functional ${\mathcal F}(\vec{h})$ we can write: \begin{eqnarray} E\left[ X_{i_1} \cdot X_{i_2} \cdot \dots \cdot X_{i_m} \right] = (-1)^m \sqrt{\frac{(2\pi)^N}{\det(A)}} \cdot \left(\frac{\partial}{\partial h_{i_1}} \cdot \cdots \cdot \frac{\partial}{\partial h_{i_m}}\right) e^{\frac{1}{2} \vec{h}^T \cdot B \cdot \vec{h} } \end{eqnarray} All we need to do is compute the derivatives using the chain rule. From the outset it is clear that both the exponential and the square-root prefactor always survive the differentiation. We compute the right-hand side for particular values of $m$ to discover the pattern. In what follows we use Einstein's summation convention. 
We have: \begin{eqnarray} E\left[ \prod\limits_{\xi=1}^1 X_{i_\xi} \right] &=& (-1)^1 {\mathcal C}_N e^{\frac{1}{2} \vec{h}^T \cdot B \cdot \vec{h} } \cdot \left[ B_{i_1,\xi} h_\xi\right]\\ E\left[\prod\limits_{\xi=1}^2 X_{i_\xi} \right] &=& (-1)^2 {\mathcal C}_N e^{\frac{1}{2} \vec{h}^T \cdot B \cdot \vec{h} } \cdot \left[ B_{i_1,\xi} B_{i_2,\eta} h_\xi h_\eta + B_{i_1,i_2}\right] \\ E\left[\prod\limits_{\xi=1}^3 X_{i_\xi} \right] &=& (-1)^3 {\mathcal C}_N e^{\frac{1}{2} \vec{h}^T \cdot B \cdot \vec{h} } \cdot \left[ B_{i_1,\xi} B_{i_2,\eta} B_{i_3,\lambda} h_\xi h_\eta h_\lambda + \left( B_{i_1,i_2} B_{i_3,\xi} + B_{i_1,i_3} B_{i_2,\xi} + B_{i_2,i_3} B_{i_1,\xi} \right) h_\xi \right]\\ E\left[\prod\limits_{\xi=1}^4 X_{i_\xi} \right] &=& (-1)^4 {\mathcal C}_N e^{\frac{1}{2} \vec{h}^T \cdot B \cdot \vec{h} } \cdot \left[\right.\\ &&\left. B_{i_1,\xi} B_{i_2,\eta} B_{i_3,\lambda} B_{i_4,\theta} h_\xi h_\eta h_\lambda h_\theta + \right.\\ &&\left. B_{i_1,i_2} B_{i_3,\xi} B_{i_4,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_1,i_3} B_{i_2,\xi} B_{i_4,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_1,i_4} B_{i_2,\xi} B_{i_3,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_2,i_3} B_{i_1,\xi} B_{i_4,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_2,i_4} B_{i_1,\xi} B_{i_3,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_3,i_4} B_{i_1,\xi} B_{i_2,\eta} h_\xi h_\eta + \right. \\ &&\left. B_{i_1,i_2} B_{i_3,i_4} + B_{i_1,i_3} B_{i_2,i_4} + B_{i_1,i_4} B_{i_2,i_3} \right] \end{eqnarray} It is not hard to see the general pattern now; it is just cumbersome to write it down for generic $m$. Note: as a sanity check, look at the case $\vec{h}=\vec{0}$. All the moments with odd $m$ vanish, and the even ones are exactly what Isserlis' theorem says they should be.
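If I am reading the signs correctly, the pattern can also be resummed: after dividing out $\mathcal F(\vec h) = {\mathcal C}_N e^{\frac12 \vec h^T B \vec h}$, these are the moments of a Gaussian with mean $\mu = -B\vec h$ and covariance $B$ — Wick-pair any even-sized subset of the indices with factors of $B$, and give each unpaired index a factor of the mean. A sketch under that reading (names are mine; `moment` returns the normalized expectation):

```python
from itertools import combinations

def pairings(idx):
    """Yield all ways to split the tuple idx into unordered pairs."""
    if not idx:
        yield []
        return
    first, rest = idx[0], idx[1:]
    for k in range(len(rest)):
        for tail in pairings(rest[:k] + rest[k + 1:]):
            yield [(first, rest[k])] + tail

def moment(indices, mu, B):
    """Normalized E[X_{i1} ... X_{im}] for a Gaussian with mean mu, covariance B:
    sum over Wick-paired subsets, unpaired indices contribute mean factors."""
    positions = range(len(indices))
    total = 0.0
    for r in range(0, len(indices) + 1, 2):       # how many indices get paired
        for paired in combinations(positions, r):
            mean_part = 1.0
            for p in positions:
                if p not in paired:
                    mean_part *= mu[indices[p]]   # unpaired -> factor of the mean
            for pr in pairings(tuple(indices[p] for p in paired)):
                term = mean_part
                for i, j in pr:
                    term *= B[i][j]               # paired -> factor of covariance
                total += term
    return total

# 1-D cross-check against the standard noncentral moments,
# E[x^3] = mu^3 + 3 mu sigma^2, with mu = 0.5 and sigma^2 = 2:
assert abs(moment((0, 0, 0), [0.5], [[2.0]]) - (0.5**3 + 3 * 0.5 * 2.0)) < 1e-12
```

For $m=2$ this reproduces $\mu_{i_1}\mu_{i_2} + B_{i_1,i_2}$, and for $m=3$ and $m=4$ it reproduces, term by term, the bracketed expressions above once the $(-1)^m$ is absorbed into the mean factors.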