Using characteristic functions to determine distribution of sum of independent normal random variables.


There is a bijective correspondence between characteristic functions and probability distributions. In Probability Theory, Durrett states that from this fact it readily follows that, given independent normal random variables $X_1$, $X_2$ with mean $0$ and variances $\sigma_1^2$ and $\sigma_2^2$ respectively, the sum $X_1+X_2$ is a normal random variable with mean $0$ and variance $\sigma_1^2+\sigma_2^2$. I imagine the proof goes something like this:

The ch. f. of $X_1$ is $$\phi_1(t)= \text{exp}\Big(\frac{-\sigma_1^2 t^2}{2}\Big).$$

The ch. f. of $X_2$ is $$\phi_2(t)= \text{exp}\Big(\frac{-\sigma_2^2 t^2}{2}\Big).$$

The ch. f. of $X_1+X_2$ is $$\phi_1(t)\phi_2(t)= \text{exp}\Big(\frac{-\sigma_1^2 t^2}{2}\Big)\text{exp}\Big(\frac{-\sigma_2^2 t^2}{2}\Big)=\text{exp}\Big(\frac{-(\sigma_1^2+\sigma_2^2) t^2}{2}\Big),$$

which is the ch.f. of a normal random variable with variance $\sigma_1^2+\sigma_2^2$, and thus, as a ch. f. determines a unique distribution, $X_1+X_2$ must be a normal random variable with variance $\sigma_1^2+\sigma_2^2$.

The only thing that troubles me is that the exact same proof seems to go through without the assumption $EX_1=EX_2=0$. Taking $\mu_1=EX_1$ and $\mu_2=EX_2$, we get:

The ch. f. of $X_1$ is $$\phi_1(t)= \text{exp}\Big(\frac{-\sigma_1^2 t^2}{2}+i\mu_1t\Big).$$

The ch. f. of $X_2$ is $$\phi_2(t)= \text{exp}\Big(\frac{-\sigma_2^2 t^2}{2}+i\mu_2t\Big).$$

The ch. f. of $X_1+X_2$ is $$\phi_1(t)\phi_2(t)= \text{exp}\Big(\frac{-(\sigma_1^2+\sigma_2^2) t^2}{2}+(\mu_1+\mu_2)it\Big),$$

which is the ch. f. of a normal random variable with variance $\sigma_1^2+\sigma_2^2$ and mean $\mu_1+\mu_2$, and thus, as a ch. f. determines a unique distribution, $X_1+X_2$ must be a normal random variable with variance $\sigma_1^2+\sigma_2^2$ and mean $\mu_1+\mu_2$.
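As a numerical sanity check of this general-mean identity, here is a quick Monte Carlo sketch comparing the empirical characteristic function of $X_1+X_2$ against the claimed closed form $\exp(-(\sigma_1^2+\sigma_2^2)t^2/2 + i(\mu_1+\mu_2)t)$. The parameter values are my own illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter choices (assumptions, not from the question).
mu1, sigma1 = 1.0, 2.0
mu2, sigma2 = -0.5, 1.5

n = 200_000
x1 = rng.normal(mu1, sigma1, n)
x2 = rng.normal(mu2, sigma2, n)
s = x1 + x2

t = np.linspace(-1.0, 1.0, 9)

# Empirical characteristic function of X1 + X2: sample mean of exp(itS).
phi_emp = np.exp(1j * np.outer(t, s)).mean(axis=1)

# Claimed closed form: ch. f. of N(mu1 + mu2, sigma1^2 + sigma2^2).
phi_sum = np.exp(-(sigma1**2 + sigma2**2) * t**2 / 2 + 1j * (mu1 + mu2) * t)

# The discrepancy should be on the order of 1/sqrt(n).
print(np.abs(phi_emp - phi_sum).max())
```

The agreement (up to Monte Carlo error) is consistent with the algebra above, though of course it does not replace the uniqueness argument.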

Is there something wrong with this second argument?