I need to prove this for my homework:
Let $X$ and $Y$ be independent random variables such that $E(X)=E(Y)=0$ and $E(X^{2})=E(Y^{2})=1$. If $X+Y$ and $X-Y$ are independent, then $X, Y \sim N(0,1)$.
I have this:
I can write $X=\dfrac{1}{2}\bigl((X+Y)+(X-Y)\bigr)$, and then, using first the independence of $X+Y$ and $X-Y$ and then the independence of $X$ and $Y$:
$$\begin{aligned}
\phi_{X}(t) &= E\left[e^{\frac{1}{2}it(X+Y)}\right]E\left[e^{\frac{1}{2}it(X-Y)}\right]\\
&= E\left[e^{\frac{1}{2}itX}\right]E\left[e^{\frac{1}{2}itY}\right]E\left[e^{\frac{1}{2}itX}\right]E\left[e^{-\frac{1}{2}itY}\right]\\
&= \left(E\left[e^{\frac{1}{2}itX}\right]\right)^{2}E\left[e^{\frac{1}{2}itY}\right]E\left[e^{-\frac{1}{2}itY}\right]\\
&= \left(\phi_{X}(t/2)\right)^{2}\phi_{Y}(t/2)\,\phi_{Y}(-t/2).
\end{aligned}$$
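As a quick numerical sanity check of this identity (my own Monte Carlo sketch, assuming NumPy; since standard normals do satisfy all the hypotheses, the two sides should agree up to sampling error):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
X = rng.standard_normal(n)  # X ~ N(0,1)
Y = rng.standard_normal(n)  # Y ~ N(0,1), independent of X

def phi(sample, t):
    """Empirical characteristic function: estimates E[exp(i*t*Z)]."""
    return np.mean(np.exp(1j * t * sample))

t = 1.3
lhs = phi(X, t)
rhs = phi(X, t / 2) ** 2 * phi(Y, t / 2) * phi(Y, -t / 2)
print(abs(lhs - rhs))  # should be small (Monte Carlo error only)
```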
I do not know how to get from here to the conclusion that $X \sim N(0,1)$.
Here's one way that I found, though it may not be the nicest. It starts with the observation that you made, namely: $$ \phi_X(t) = \phi_X(t/2) \, \phi_X(t/2) \, \phi_Y(t/2) \, \phi_Y(-t/2) . $$ You could interpret this as saying that $X$ has the same distribution as $$ \frac{X_1 + X_2 + Y_1 - Y_2}{2}, $$ where $X_1, X_2, Y_1, Y_2$ are all independent and where the $X_i$'s have the same distribution as $X$ (resp., the $Y_i$'s have the same distribution as $Y$).

Now the idea is that we may further divide each $X_i$ and $Y_i$ into a sum of four independent random variables, so that $X$ has the same distribution as a sum of sixteen random variables. Half of these are $X$'s, half are $Y$'s, and some have plus signs while others have minus signs. This suggests that $X$ may be normal, since it can be expressed as an "average of many random variables," akin to the Central Limit Theorem. The problem is that the CLT itself does not apply.

Now, if we were in the nicer situation of knowing $$ X \overset{d}{=} \frac{X_1 + \dotsb + X_n}{\sqrt n}$$ for any $n$, where again the $X_i$'s are independent copies of $X$, then we could invoke the CLT to say that the right-hand side converges in distribution to a standard normal $Z \sim N(0,1)$; but $X$ is equal to this limit in distribution, so $X$ is standard normal.
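To see why the nicer i.i.d. situation would finish the argument, here is a Monte Carlo sketch (my own illustration, assuming NumPy; the centered uniform law is a stand-in I chose for a mean-zero, variance-one $X$): the normalized sum $(X_1 + \dotsb + X_n)/\sqrt{n}$ already looks standard normal for moderate $n$.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 50_000  # n summands per sum, m Monte Carlo replicates

# X_i: mean 0, variance 1 (uniform on [-sqrt(3), sqrt(3)] as a stand-in)
Xi = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(m, n))
S = Xi.sum(axis=1) / np.sqrt(n)  # (X_1 + ... + X_n) / sqrt(n)

# If S is approximately N(0,1): mean ~ 0, variance ~ 1, P(S <= 0) ~ 1/2
print(S.mean(), S.var(), (S <= 0).mean())
```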
However, for our situation of mixed $X$'s and $Y$'s with mixed signs, we need a more general result. Specifically, the Lindeberg CLT tells us when a "large average of random variables" converges in distribution to a normal, where the random variables need not have the same distribution (a priori, $\pm X$ and $\pm Y$ don't). However, we must ensure that no single random variable contributes too much to the variance of the average: this is the idea behind the Lindeberg condition. To apply it to this problem, after $n$ division operations take the sequence $\varepsilon_1 Z_1, \dotsc, \varepsilon_{4^n} Z_{4^n}$ (remember that we obtained sixteen random variables from $n=2$ division operations), where each $Z_i$ has either the distribution of $X$ or of $Y$, and where $\varepsilon_i = \pm 1$. Then $\mu_i = 0$ and $\sigma_i^2 = \mathrm{Var}(\varepsilon_i Z_i) = 1$ for each $i$, so that $$ s_{4^n}^2 = \sum_{i=1}^{4^n} \sigma_i^2 = 4^n .$$ You can then verify the Lindeberg condition to conclude that $$ \frac{1}{s_{4^n}} \sum_{i=1}^{4^n} \varepsilon_i Z_i = \frac{\varepsilon_1 Z_1 + \dotsb + \varepsilon_{4^n} Z_{4^n}}{2^n} $$ converges in distribution to the standard normal. Since $X$ is equal in distribution to this sum for every $n$, it follows that $X \sim N(0,1)$.
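The mixed-sign, mixed-distribution sum behaves the same way in simulation. Below is a hedged sketch (again my own illustration, assuming NumPy; the uniform and Rademacher laws are stand-ins I chose for non-normal $X$ and $Y$ with mean $0$ and variance $1$) of the normalized signed sum $\frac{1}{2^n}\sum_{i=1}^{4^n} \pm Z_i$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4        # number of splitting steps -> 4**n summands
N = 4 ** n
m = 20_000   # Monte Carlo replicates

# Stand-ins for the laws of X and Y: mean 0, variance 1, deliberately non-normal
Z = np.empty((m, N))
Z[:, ::2] = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(m, N // 2))  # "X" copies
Z[:, 1::2] = rng.choice([-1.0, 1.0], size=(m, N // 2))              # "Y" copies (Rademacher)

eps = rng.choice([-1.0, 1.0], size=N)  # one fixed +/-1 sign per summand
S = (Z * eps).sum(axis=1) / 2 ** n     # (1/2^n) * sum of signed summands

# Should look standard normal: mean ~ 0, variance ~ 1, P(S > 1.96) ~ 0.025
print(S.mean(), S.var(), (S > 1.96).mean())
```

Each signed summand keeps mean $0$ and variance $1$, so dividing the sum of $4^n$ of them by $\sqrt{4^n} = 2^n$ matches the Lindeberg normalization above.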