Characteristic function of a random vector


We consider the random vector $X\colon \Omega \to \mathbb {R}^n$ defined on the probability space $(\Omega, \mathfrak F, P)$. Denote by $\Phi_{X}(x) = \mathbb E(e^{i\left<x, X\right>})$ its characteristic function.

I would like to show the following equivalence: $X$ is a Gaussian vector if and only if $\Phi_{X}(x)$ is given by $$\Phi_{X}(x)= e^{i\left<m, x\right> -\frac{1}{2}\left<A x, x\right>} \qquad (*),$$ where $m=(\mathbb E(X_1), \dots, \mathbb E(X_n))$ and $A=\operatorname{Cov}(X)$.

I showed the direct implication (i.e. if $X$ is a Gaussian vector then $\Phi_{X}(x)$ is given by $(*)$). In fact, I used $\Phi_{X}(x)=\Phi_{Z_x}(1) = \exp\{im_{x} - \frac{1}{2}\sigma_{x}^2\} = \dots$, where $Z_x=\sum_{j=1}^{n}x_j X_j$ is a random variable with distribution $N(m_x, \sigma_x^2)$.
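The remaining "$\dots$" in this sketch amounts to identifying $m_x$ and $\sigma_x^2$ in terms of $m$ and $A$, using only the definitions above:

```latex
% Mean and variance of Z_x = <x, X>, completing the direct direction
\begin{align*}
m_x &= \mathbb{E} Z_x = \sum_{j=1}^{n} x_j\,\mathbb{E} X_j = \langle m, x\rangle,\\
\sigma_x^2 &= \operatorname{Var}(Z_x)
  = \sum_{j,k=1}^{n} x_j x_k \operatorname{Cov}(X_j, X_k)
  = \langle A x, x\rangle,
\end{align*}
so that
\[
\Phi_X(x) = \exp\Bigl\{ i\langle m, x\rangle - \tfrac{1}{2}\langle A x, x\rangle \Bigr\},
\]
which is exactly $(*)$.
```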

Now, I need help with the opposite direction.

Thank you in advance


There are 3 solutions below.


Hint: fix $a := (a_1,\ldots,a_n)$ and compute the characteristic function of $\langle a, X\rangle$. A full answer follows.

Assuming $\Phi_X$ is given by $(*)$, the characteristic function of $a_1X_1 + \cdots + a_n X_n$ is \begin{align*}\Phi_{\langle a, X \rangle}(t) &= \mathbb{E}(e^{it\langle a, X \rangle}) \\&= \Phi_X(ta) \\&= \exp\left(i \langle m, ta\rangle - \frac{1}{2}\langle A(ta),ta\rangle \right) \\&= \exp\left(i \langle m,a\rangle t - \frac{1}{2}\langle Aa,a\rangle t^2\right).\end{align*} The last expression is the characteristic function of $N(\langle m,a\rangle, \langle Aa,a\rangle)$, so every linear combination of the components of $X$ is Gaussian, i.e. $X$ is a Gaussian vector.
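As a numerical sanity check of the closed form above (a sketch only; the 2-dimensional mean, covariance, $a$, and $t$ below are arbitrary made-up values), one can compare the empirical characteristic function of $\langle a, X\rangle$ against $\exp(i\langle m,a\rangle t - \tfrac12\langle Aa,a\rangle t^2)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: X ~ N(m, A) in dimension 2
m = np.array([1.0, -2.0])
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
a = np.array([0.3, 0.7])
t = 1.5

# Empirical characteristic function of <a, X> from Monte Carlo samples
X = rng.multivariate_normal(m, A, size=200_000)
emp = np.mean(np.exp(1j * t * (X @ a)))

# Closed form: N(<m,a>, <Aa,a>) gives exp(i<m,a>t - 0.5<Aa,a>t^2)
theory = np.exp(1j * (m @ a) * t - 0.5 * (a @ A @ a) * t**2)

print(abs(emp - theory))  # small Monte Carlo error
```

The two values agree up to the usual $O(n^{-1/2})$ sampling error.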


The converse follows immediately from Lévy's continuity theorem, which in particular implies that any two random vectors with the same characteristic function are equal in distribution: the right-hand side of $(*)$ is the characteristic function of an $N(m, A)$ vector, so $X$ must have that distribution.


Conversely, suppose $X\in\mathbb{R}^{d}$ is a random vector such that for every $b\in\mathbb{R}^{d}$, the random variable $Z:=\langle b,X\rangle$ is Gaussian.

Denote the mean of $X$ by $\mu$ and the covariance matrix of $X$ by $\Sigma$. Let us compute the mean and variance of $Z$: $$\mathbb{E}Z=\mathbb{E}b^{\intercal}X=b^{\intercal}\mathbb{E}X=b^{\intercal}\mu.$$

To compute the variance, note that $\langle b,X\rangle=\langle X,b\rangle$, i.e. $b^{\intercal}X=X^{\intercal}b$; these are scalars, not vectors, so each equals its own transpose.

Therefore, we have \begin{align*} \operatorname{Var}(Z)=\operatorname{Var}(X^{\intercal}b)&=\mathbb{E}(X^{\intercal}b)^{2}-(\mathbb{E}X^{\intercal}b)^{2}\\ &=\mathbb{E}\Big[(X^{\intercal}b)^{\intercal}(X^{\intercal}b)\Big]-\mathbb{E}(X^{\intercal}b)\,\mathbb{E}(X^{\intercal}b)\\ &=\mathbb{E}(b^{\intercal}X X^{\intercal}b)-\mathbb{E}(b^{\intercal}X)\,\mathbb{E}(X^{\intercal}b)\\ &=b^{\intercal}\mathbb{E}(XX^{\intercal})b-b^{\intercal}\mathbb{E}X\,\mathbb{E}X^{\intercal}b\\ &=b^{\intercal}\big(\mathbb{E}XX^{\intercal}-\mathbb{E}X\,\mathbb{E}X^{\intercal}\big)b\\ &=b^{\intercal}\operatorname{Cov}(X)\,b\\ &=b^{\intercal}\Sigma b. \end{align*}
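A quick simulation check of the identity $\operatorname{Var}(b^{\intercal}X)=b^{\intercal}\Sigma b$ (a sketch; $\mu$, $\Sigma$, and $b$ below are arbitrary made-up values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 3-d example: compare sample mean/variance of Z = b^T X
# with the closed forms b^T mu and b^T Sigma b.
mu = np.array([0.5, -1.0, 2.0])
Sigma = np.array([[1.0, 0.3, 0.0],
                  [0.3, 2.0, 0.4],
                  [0.0, 0.4, 1.5]])
b = np.array([1.0, -2.0, 0.5])

X = rng.multivariate_normal(mu, Sigma, size=500_000)
Z = X @ b

emp_mean, emp_var = Z.mean(), Z.var()
print(emp_mean, b @ mu)        # close up to sampling error
print(emp_var, b @ Sigma @ b)  # close up to sampling error
```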

Thus, $Z$ is a Gaussian random variable with mean $b^{\intercal}\mu$ and variance $b^{\intercal}\Sigma b$, so it has characteristic function $$\varphi_{Z}(t)=e^{ib^{\intercal}\mu t-\frac{1}{2}b^{\intercal}\Sigma b t^{2}},$$ and in particular $$\varphi_{Z}(1)=e^{ib^{\intercal}\mu -\frac{1}{2}b^{\intercal}\Sigma b},$$ but $$\varphi_{Z}(1)=\mathbb{E}e^{iZ}=\mathbb{E}e^{i\langle b,X\rangle}=\varphi_{X}(b),$$ which is exactly the form $(*)$.