An integral related to Gaussian distribution


I am trying to evaluate the integral

$$\int_{\mathbb{R}^d} e^{i \langle z,x \rangle} e^{- \langle x,A^{-1}x \rangle /2} dx $$

where $A$ is a positive-definite symmetric matrix.

As a first step (not essential, but it removes the inverse), I substituted $x \mapsto Ax$ to obtain

$$ |\det A|\int_{\mathbb{R}^d} e^{i \langle z, Ax \rangle} e^{- \langle A x,x \rangle /2} dx . $$

What I really want to do now is make some orthogonal change of variables so that all the matrices inside the integral are diagonal. However, I cannot make this calculation work out, regardless of what I try.

Note that it is possible to diagonalize $A$ using an orthogonal matrix, given the hypotheses on $A$.

Many thanks for your help.

EDIT:

Let

$$f_X(x) = \frac{1}{\sqrt{(2\pi)^d\operatorname{det}(A)}}\exp\left(-\frac{1}{2}\langle x, A^{-1}x\rangle\right).$$

We calculate as follows, writing $A = UDU^T$ with $U$ orthogonal and $D$ diagonal, and substituting $y = U^T x$ (which has unit Jacobian):

\begin{align} \int_{\mathbb{R}^d} e^{i \langle z,x \rangle} f_X(x)\,dx &= \frac{1}{\sqrt{(2\pi)^d\operatorname{det}(A)}} \int_{\mathbb{R}^d} e^{i \langle U^T z,y \rangle} e^{-\langle y, D^{-1}y \rangle /2}\, dy \\ &= \frac{1}{\sqrt{(2\pi)^d\operatorname{det}(A)}} \prod_{n=1}^{d} \int_{\mathbb{R}} e^{i (U^T z)_n y_n} e^{-y_n^2/(2d_n)}\, dy_n \end{align}

where, for a vector $x \in \mathbb{R}^d$ I write $x = (x_1,...,x_d)$, and where the $d_n$ for $n \in \{1,...,d\}$ are the diagonal entries of $D$.

Continuing from above, using the integral from the $1$-D case, we get

\begin{align} &= \frac{1}{\sqrt{(2\pi)^d\operatorname{det}(A)}} \prod_{n=1}^{d} \exp \left( -\frac{1}{2} d_n (U^T z)^2_n \right) (2 \pi d_n)^{1/2}\\ &=\prod_{n=1}^{d} \exp \left( -\frac{1}{2} d_n (U^T z)^2_n \right) \qquad \text{(since } \textstyle\prod_n d_n = \det D = \det A\text{)}\\ &= \exp \left( -\frac{1}{2} \langle z, Az \rangle \right), \end{align}

as expected.
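The derivation above can be sanity-checked numerically. The following is a minimal sketch, assuming NumPy is available; the matrix $A$ and vector $z$ are arbitrary illustrative choices. It approximates the normalized integral on a grid in $d = 2$ and compares it with $\exp(-\frac{1}{2}\langle z, Az\rangle)$.

```python
import numpy as np

# Numerical check in d = 2: approximate
#   (1 / sqrt((2*pi)^d det A)) * integral of e^{i<z,x>} e^{-<x, A^{-1}x>/2} dx
# by a Riemann sum and compare with exp(-<z, A z>/2).

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])      # example positive-definite symmetric matrix
Ainv = np.linalg.inv(A)
z = np.array([0.3, -0.7])       # arbitrary test vector

# Grid wide enough to capture essentially all of the Gaussian mass.
t = np.linspace(-10, 10, 801)
h = t[1] - t[0]
X, Y = np.meshgrid(t, t, indexing="ij")
pts = np.stack([X, Y], axis=-1)                      # shape (801, 801, 2)

quad = np.einsum("ijk,kl,ijl->ij", pts, Ainv, pts)   # <x, A^{-1} x> at each point
phase = pts @ z                                      # <z, x> at each point
integrand = np.exp(1j * phase - 0.5 * quad)

norm = np.sqrt((2 * np.pi) ** 2 * np.linalg.det(A))
numeric = integrand.sum() * h * h / norm
exact = np.exp(-0.5 * z @ A @ z)

print(abs(numeric - exact))     # difference should be tiny
```

Since the integrand decays like a Gaussian, the plain Riemann sum converges very fast here; no sophisticated quadrature is needed for a sanity check.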

There are 2 best solutions below

On BEST ANSWER

Hint: $$ \langle x,A^{-1}x\rangle = x^T A^{-1}x = x^T A^{-1/2}A^{-1/2}x = \langle A^{-1/2}x,A^{-1/2}x \rangle $$ from there, take $u = A^{-1/2}x$.

Alternatively, to use orthogonal matrices: note that if $A = UDU^T$ (with $U$ orthogonal), we have $$ \langle x,A^{-1}x\rangle = \langle x,UD^{-1}U^Tx\rangle = \langle U^Tx,D^{-1}U^Tx\rangle = \langle [U^Tx],D^{-1}[U^Tx]\rangle $$ from there, make the substitution $y = U^T x$.
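The identity behind this substitution is easy to verify numerically. The following sketch (assuming NumPy; $A$ and $x$ are illustrative choices) uses `np.linalg.eigh` to obtain $A = UDU^T$ with $U$ orthogonal, and checks that $\langle x, A^{-1}x\rangle = \langle U^Tx, D^{-1}U^Tx\rangle$.

```python
import numpy as np

# Verify <x, A^{-1} x> == <U^T x, D^{-1} U^T x> for A = U D U^T, U orthogonal.

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # example positive-definite symmetric matrix
d_vals, U = np.linalg.eigh(A)       # eigh gives A == U @ np.diag(d_vals) @ U.T

x = np.array([1.0, -2.0])
lhs = x @ np.linalg.inv(A) @ x      # <x, A^{-1} x>
y = U.T @ x                         # the substitution y = U^T x
rhs = np.sum(y**2 / d_vals)         # <y, D^{-1} y>, with D diagonal

print(np.isclose(lhs, rhs))        # True
```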


Let $X = (X_1, X_2, \ldots, X_d)$ denote a vector of $d$ zero-mean jointly Gaussian random variables with invertible covariance matrix $A$. Then the joint density is $$f_X(x) = \frac{1}{\sqrt{(2\pi)^d\operatorname{det}(A)}}\exp\left(-\frac{1}{2}\langle x, A^{-1}x\rangle\right)$$ and so we recognize the given integral as having value $cE\left[e^{i\langle z,X\rangle}\right] = cE\left[e^{iZ}\right] = c\Psi_Z(1)$ where $c = \sqrt{(2\pi)^d\operatorname{det}(A)}$, and $\Psi_Z(\cdot)$ denotes the characteristic function of the random variable $Z = \langle z, X\rangle = \sum_i z_i X_i$. But linear combinations of jointly Gaussian random variables are Gaussian, and so we conclude that $Z$ is a zero-mean Gaussian random variable with variance $\langle z, Az\rangle$. Thus, $\Psi_Z(\omega) = \exp\left(-\frac{1}{2}\langle z, Az\rangle\omega^2\right)$, and evaluating at $\omega = 1$ gives the value of the integral as $$\sqrt{(2\pi)^d\operatorname{det}(A)}\cdot\exp\left(-\frac{1}{2}\langle z, Az\rangle\right).$$
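The characteristic-function argument also lends itself to a quick Monte Carlo check. The following sketch (assuming NumPy; $A$, $z$, and the sample size are illustrative choices) draws samples of $X \sim N(0, A)$ and compares the sample mean of $e^{i\langle z,X\rangle}$ with $\exp(-\frac{1}{2}\langle z, Az\rangle)$.

```python
import numpy as np

# Monte Carlo check: for X ~ N(0, A), the sample mean of e^{i<z,X>}
# should approach exp(-<z, A z>/2) as the sample size grows.

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # example covariance matrix
z = np.array([0.3, -0.7])

X = rng.multivariate_normal(mean=np.zeros(2), cov=A, size=200_000)
estimate = np.mean(np.exp(1j * X @ z))
exact = np.exp(-0.5 * z @ A @ z)

print(abs(estimate - exact))       # small; Monte Carlo error is O(1/sqrt(N))
```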

I would be interested in knowing what value you obtained via the approach pointed out by @Omnomnomnom.