The sum and difference of two independent random variables are independent


I was reading the book "Brownian Motion" by Mörters and Peres, and in the proof of existence (the proof of Theorem 1.3) I was confused by this paragraph:

(Notation: $D_k$ is the set of dyadic points of $[0,1]$ at level $k$, i.e. $D_k=\{j2^{-k} : 0\le j\le 2^k\}$; $d\in D_n\setminus D_{n-1}$; $Z_t\sim \mathcal{N}(0,1)$; the family $\{Z_t : t\in \cup_{n=0}^{\infty} D_n\}$ is independent; and $B(d)=\frac{B(d+2^{-n})+B(d-2^{-n})}{2}+\frac{Z_d}{2^{(n+1)/2}}$.)

... as $\frac{1}{2}[B(d+2^{-n})-B(d-2^{-n})]$ depends only on $(Z_t : t\in D_{n-1})$, it is independent of $Z_d/2^{(n+1)/2}$. By our induction assumptions both terms are normally distributed with mean zero and variance $2^{-(n+1)}$. Hence their sum $B(d)-B(d-2^{-n})$ and difference $B(d+2^{-n})-B(d)$ are independent and normally distributed with mean zero and variance $2^{-n}$.

So far I have checked that $\frac{1}{2}[B(d+2^{-n})-B(d-2^{-n})]\sim \mathcal{N}(0,2^{-(n+1)})$ and that $Z_d/2^{(n+1)/2}\sim \mathcal{N}(0,2^{-(n+1)})$. Moreover, writing $X:=\frac{1}{2}[B(d+2^{-n})-B(d-2^{-n})]$ and $Y:=Z_d/2^{(n+1)/2}$, I get $B(d)-B(d-2^{-n})=X+Y$ and $B(d+2^{-n})-B(d)=X-Y$, both of which have the claimed distribution $\mathcal{N}(0,2^{-n})$, and $X\perp Y$.

So my question is: how can they conclude that $X\perp Y \;\Rightarrow\; (X+Y) \perp (X-Y)$? I was thinking it might have something to do with the fact that if $R_1\perp R_2$ and $f_1, f_2$ are measurable functions, then $f_1(R_1)\perp f_2(R_2)$, but I am not sure how to show this formally, whether the implication is true in general, or whether it holds here because of some property of the normal distribution/Brownian motion.
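A quick Monte Carlo sanity check of the fact in question (only a sketch, not a proof; NumPy is assumed, and the level $n=3$, the seed, and the threshold are arbitrary illustrative choices):

```python
import numpy as np

# Sanity check (not a proof): for independent X ~ N(0, 2^{-(n+1)}) and
# Y ~ N(0, 2^{-(n+1)}), the pair (X+Y, X-Y) should behave like two
# independent N(0, 2^{-n}) variables.  n = 3 is an arbitrary choice.
rng = np.random.default_rng(0)
n = 3
sigma2 = 2.0 ** (-(n + 1))           # common variance of X and Y
N = 1_000_000

X = rng.normal(0.0, np.sqrt(sigma2), N)
Y = rng.normal(0.0, np.sqrt(sigma2), N)
S, D = X + Y, X - Y                  # sum and difference

print("Var(S), Var(D), target 2^-n:", S.var(), D.var(), 2.0 ** (-n))
print("empirical corr(S, D):", np.corrcoef(S, D)[0, 1])

# Crude independence check on one joint event: the joint probability
# should roughly match the product of the marginal probabilities.
a = b = np.sqrt(2.0 ** (-n))         # one standard deviation of S and D
joint = np.mean((S > a) & (D > b))
product = np.mean(S > a) * np.mean(D > b)
print("P(S>a, D>b) vs P(S>a)P(D>b):", joint, product)
```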

Best answer:

For Normal random variables, you can see this by computing moment generating functions. If $X \sim N(\mu_1, \sigma^2)$ and $Y \sim N(\mu_2,\sigma^2)$ are independent, then $$ M_{X+Y,X-Y}(t,s) = E[e^{t(X+Y) + s(X-Y)}] = E[e^{(t+s)X}] E[e^{(t-s) Y}] = e^{(t+s)\mu_1 + \frac{(t+s)^2 \sigma^2}{2}} e^{(t-s)\mu_2 + \frac{(t-s)^2 \sigma^2}{2}} = e^{t(\mu_1 + \mu_2) + t^2 \sigma^2} e^{s(\mu_1 - \mu_2) + s^2 \sigma^2},$$

which is the joint moment generating function of a pair of independent random variables that are marginally $N(\mu_1 + \mu_2, 2 \sigma^2)$ and $N(\mu_1 - \mu_2, 2\sigma^2)$. (Note that the two variables having the same variance $\sigma^2$ is exactly what makes the cross term in $ts$ cancel.)
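If you want to double-check the algebra in the display above, the exponents on the two sides can be compared symbolically; a minimal sketch with SymPy (the symbol names are mine, not from the answer):

```python
import sympy as sp

t, s, mu1, mu2, sigma = sp.symbols('t s mu1 mu2 sigma', real=True)

# Exponent of E[e^{(t+s)X}] * E[e^{(t-s)Y}] for independent
# X ~ N(mu1, sigma^2) and Y ~ N(mu2, sigma^2).
lhs = ((t + s) * mu1 + (t + s)**2 * sigma**2 / 2
       + (t - s) * mu2 + (t - s)**2 * sigma**2 / 2)

# Exponent of the claimed product: the MGF of N(mu1 + mu2, 2*sigma^2)
# evaluated at t, times the MGF of N(mu1 - mu2, 2*sigma^2) evaluated at s.
rhs = (t * (mu1 + mu2) + t**2 * sigma**2
       + s * (mu1 - mu2) + s**2 * sigma**2)

# Prints 0: the joint MGF factors exactly as claimed.
print(sp.expand(lhs - rhs))
```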

In fact, the converse holds as well: the Kac-Bernstein theorem says that if $X$ and $Y$ are independent and $X+Y$ and $X-Y$ are also independent, then $X$ and $Y$ are both normally distributed.
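To see numerically why the converse singles out the normal distribution, here is an illustrative sketch (my own choice of distribution and thresholds, not from the theorem): for independent uniform $X$ and $Y$, which have equal variances, the sum and difference are uncorrelated yet clearly not independent.

```python
import numpy as np

# Contrast with a non-normal case: X, Y i.i.d. Uniform(0, 1).
# Their sum and difference are uncorrelated (equal variances) but NOT
# independent, consistent with the Kac-Bernstein theorem.  The thresholds
# 1.9 and 0.05 are arbitrary choices that expose the dependence.
rng = np.random.default_rng(1)
N = 1_000_000

X = rng.uniform(0.0, 1.0, N)
Y = rng.uniform(0.0, 1.0, N)
S, D = X + Y, X - Y

print("empirical corr(S, D):", np.corrcoef(S, D)[0, 1])   # close to 0

joint = np.mean((S > 1.9) & (np.abs(D) > 0.05))
product = np.mean(S > 1.9) * np.mean(np.abs(D) > 0.05)
print("P(S>1.9, |D|>0.05)   =", joint)     # roughly 0.0012
print("product of marginals =", product)   # roughly 0.0045 -> dependent
```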