Show $X_1$ and $X_2$ have a common Gaussian distribution


Does anyone have an idea about the following question?

Let $\Bbb E(X_1^2)$ and $\Bbb E(X_2^2)$ be finite. Show that if $X_1$ and $X_2$ are independent and likewise $X_1+X_2$ and $X_1-X_2$, then both $X_1$ and $X_2$ have a common Gaussian distribution.

There is a hint that "$2x_1 = 2x_1-x_2+x_2$ and $2x_2=2x_2-x_1+x_1$, and then apply the central limit theorem four times", but I still have no idea how to proceed. Thank you.

There are 2 best solutions below


Let $\phi_1(t)=\mathbb{E}(e^{itX_1})$ and $\phi_2(t)=\mathbb{E}(e^{itX_2})$ be the characteristic functions of $X_1$ and $X_2$ respectively.

Since $2X_1=(X_1+X_2)+(X_1-X_2)$, we conclude by independence that $$\phi_1(2t)=\mathbb{E}(e^{it(X_1+X_2)})\mathbb{E}(e^{it(X_1-X_2)})= \mathbb{E}(e^{it X_1 })\mathbb{E}(e^{itX_2})\mathbb{E}(e^{it X_1}) \mathbb{E}(e^{-itX_2}),$$ that is, $$\phi_1(2t)=\phi_1^2(t)\vert \phi_2(t)\vert^2.\tag{1}$$ Similarly, since $2X_2=(X_1+X_2)-(X_1-X_2)$, we get $$\phi_2(2t)=\phi_2^2(t)\vert \phi_1(t)\vert^2.\tag{2}$$ From $(1)$ and $(2)$ we see that $\vert\phi_1(2t)\vert =\vert \phi_2(2t)\vert$ for every $t\in\mathbb{R}$, and consequently $$\forall\,t\in \mathbb{R},\qquad \vert\phi_1(t)\vert =\vert \phi_2(t)\vert.\tag{3}$$ Combining $(1)$, $(2)$ and $(3)$, $$\vert\phi_k(t)\vert=\left\vert\phi_k\left(\frac{t}{2}\right)\right\vert^4,\quad\hbox{for $t\in\mathbb{R}$ and $k=1,2$}.\tag{4}$$
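As a quick numerical sanity check of identity $(1)$ (not part of the proof): the sketch below assumes, for illustration only, that $X_1$ and $X_2$ are Gaussian with a common variance, so that their characteristic functions have the closed form $e^{i\mu t-\sigma^2 t^2/2}$; the means, variance, and test points are arbitrary.

```python
import cmath

# Sanity check of identity (1): phi1(2t) = phi1(t)^2 |phi2(t)|^2.
# Illustrative assumption: X1 ~ N(mu1, sigma2), X2 ~ N(mu2, sigma2),
# whose characteristic function is exp(i*mu*t - sigma2*t^2/2).

def phi(t, mu, sigma2):
    return cmath.exp(1j * mu * t - sigma2 * t * t / 2)

mu1, mu2, sigma2 = 0.3, -1.1, 2.0  # arbitrary means, common variance
for t in [0.0, 0.5, 1.7, -2.3]:
    lhs = phi(2 * t, mu1, sigma2)
    rhs = phi(t, mu1, sigma2) ** 2 * abs(phi(t, mu2, sigma2)) ** 2
    assert abs(lhs - rhs) < 1e-12
print("identity (1) holds for equal-variance Gaussians")
```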

  • If $X_1$ is constant, then $\vert\phi_1(t)\vert\equiv1$ and from $(3)$ we get $\vert\phi_2(t)\vert\equiv1$, so $X_2$ is also constant; the converse is also true. This is the trivial case where $X_1$ and $X_2$ are constant random variables.
  • So let us suppose that $X_1$ is not a constant random variable. Since $X_1$ is square integrable, in a neighborhood of $0$ we have $$\phi_1(t)=1+i\mu_1 t-\frac{s_1^2}{2}t^2 +o(t^2)$$ where $\mu_1=\mathbb{E}(X_1)$ and $s_1^2=\mathbb{E}(X_1^2)$. From $(4)$ we have $$\vert\phi_1(t)\vert=\left\vert\phi_1\left(t2^{-n}\right)\right\vert^{4^n}=\left\vert\left(1+i\frac{\mu_1t}{2^n}-\frac{s_1^2t^2}{2}\cdot\frac{1}{4^n}+o(4^{-n})\right)^{4^n}\right\vert.$$ Taking the limit as $n$ tends to $+\infty$, we see that $$\forall\,t\in\mathbb{R},\qquad\vert\phi_1(t)\vert=e^{-\sigma_1^2t^2/2},$$ where $\sigma_1^2=s_1^2-\mu_1^2$. By $(3)$, $$\forall\,t\in\mathbb{R},\qquad\vert\phi_2(t)\vert=e^{-\sigma_1^2t^2/2}.$$ In particular ${\rm var}(X_2)={\rm var}(X_1)=\sigma_1^2=\sigma^2$.
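The limit step $\vert\phi(t/2^n)\vert^{4^n}\to e^{-\sigma^2t^2/2}$ can be watched converge numerically. The sketch below uses an illustrative assumption not taken from the proof: $X$ uniform on $[-1,1]$, a centered square-integrable variable with $\phi(t)=\sin(t)/t$ and $\sigma^2=1/3$.

```python
import math

# Numerical check of the limit step |phi(t/2^n)|^(4^n) -> exp(-sigma^2 t^2/2)
# for a centered, square-integrable variable. Illustrative assumption:
# X uniform on [-1, 1], so phi(t) = sin(t)/t and sigma^2 = 1/3.

def phi_uniform(t):
    return 1.0 if t == 0 else math.sin(t) / t

t = 1.5
limit = math.exp(-(1.0 / 3.0) * t * t / 2)  # exp(-sigma^2 t^2 / 2)
for n in [2, 5, 10, 14]:
    approx = abs(phi_uniform(t / 2**n)) ** (4**n)
    print(n, approx)
assert abs(abs(phi_uniform(t / 2**14)) ** (4**14) - limit) < 1e-6
```

Note that the uniform distribution does not satisfy the hypotheses of the problem; it only illustrates that this Taylor-expansion limit holds for any centered square-integrable variable.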

Going back to $(1)$, and using $\vert\phi_2(t)\vert=e^{-\sigma^2t^2/2}$, we see that for every $t$ and $n$ we have $$\phi_1(t)=\left(\phi_1\left(\frac{t}{2^n}\right)\right)^{2^n}e^{-\sigma^2(1-2^{-n})t^2/2}.$$ Taking the limit as $n$ tends to $+\infty$ (using $\phi_1(s)=1+i\mu_1 s+o(s)$ near $0$), we get $\phi_1(t)=e^{i\mu_1 t-\sigma^2t^2/2}$, and similarly we obtain $\phi_2(t)=e^{i\mu_2 t-\sigma^2t^2/2}$. So both $X_1$ and $X_2$ are Gaussian variables, with the same variance.
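The iterated identity above holds exactly for the Gaussian characteristic function that the proof arrives at, which gives a simple consistency check. The sketch below assumes $X_1\sim N(\mu,\sigma^2)$ (illustrative parameters, not from the original argument).

```python
import cmath

# Consistency check of the iterated identity
#   phi1(t) = phi1(t/2^n)^(2^n) * exp(-sigma^2 (1 - 2^-n) t^2 / 2),
# which holds exactly for the Gaussian characteristic function
# exp(i*mu*t - sigma^2*t^2/2) (illustrative assumption: X1 ~ N(mu, sigma^2)).

MU, SIGMA2 = 0.7, 1.8  # arbitrary illustrative parameters

def phi1(t):
    return cmath.exp(1j * MU * t - SIGMA2 * t * t / 2)

t = 2.4
for n in range(1, 8):
    rhs = phi1(t / 2**n) ** (2**n) * cmath.exp(-SIGMA2 * (1 - 2.0**-n) * t * t / 2)
    assert abs(phi1(t) - rhs) < 1e-9
print("iterated identity verified for the Gaussian characteristic function")
```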

Conversely, if $X_1$ and $X_2$ are independent Gaussian variables with the same variance, then $${\rm cov}(X_1-X_2,X_1+X_2)=\mathbb{E}(X_1^2-X_2^2)-\mathbb{E}(X_1-X_2)\,\mathbb{E}(X_1+X_2)={\rm var}(X_1)-{\rm var}(X_2)=0,$$ and consequently they are independent, since $(X_1-X_2,X_1+X_2)$ is a Gaussian vector and uncorrelated jointly Gaussian variables are independent. (This implication is only valid for Gaussian variables.)


This result (without the assumption of finite second moments) is due to Mark Kac: On a characterization of the normal distribution, dating to 1939. When $X_1$ and $X_2$ are symmetric, the approach hinted at in your question leads one to show that, for $n=1,2,\ldots$, the random variable $X_1$ has the same distribution as $$ {X_{n,1}+X_{n,2}+\cdots+X_{n,2^n}\over 2^{n/2}} $$ (where the $X_{n,k}$ are iid with the same distribution as $X_1$), following which the CLT shows that $X_1\sim\mathcal N(0,\sigma^2)$ for $\sigma^2=\Bbb E[X_1^2]$. Likewise for $X_2$.
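The distributional identity used here is exactly the stability property of the Gaussian under this scaling, which can be checked on characteristic functions. The sketch below assumes $X_1\sim N(0,\sigma^2)$ (the symmetric solution; $\sigma^2$ is an arbitrary illustrative value).

```python
import math

# Check of the scaling identity behind the CLT argument: if X1 ~ N(0, sigma^2),
# the characteristic function of (X_{n,1} + ... + X_{n,2^n}) / 2^(n/2) equals
# that of X1, i.e. phi(t / 2^(n/2))^(2^n) = phi(t), where
# phi(t) = exp(-sigma^2 t^2 / 2).

SIGMA2 = 2.5  # arbitrary illustrative variance

def phi(t):
    return math.exp(-SIGMA2 * t * t / 2)

t = 1.1
for n in range(1, 10):
    assert abs(phi(t / 2 ** (n / 2)) ** (2**n) - phi(t)) < 1e-12
print("scaled-sum identity verified for the centered Gaussian cf")
```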

Kac then relates an argument of A. Wintner showing how to remove the symmetry assumption, using a (non-trivial) theorem of H. Cramér (which states that if $X$ and $Y$ are independent random variables such that $X+Y$ is normally distributed, then both $X$ and $Y$ are normally distributed). It would be quite interesting if a direct CLT argument could be made, bypassing Cramér's result.