Sufficient condition for Independence of two random variables


$X$ and $W$ are bivariate normal random variables with probability density functions $f(x)$ and $f(w)$, and $Y=\frac{X}{\sigma_X}-\frac{\rho_{X,W}\,W}{\sigma_W}$ is a normal random variable with density $f(y)$. The product $f(x)\,f(y)$ is a bivariate normal density with $\rho_{X,Y}=0$. I'm wondering how I can conclude from the above that $X$ and $Y$ are independent. Also, is there a way to derive $f(x,y)$?

There are 2 answers below.

Answer 1:

They need not be independent. For example, if $X$ has the $N(0,1)$ distribution and $Y=X$, then $f(x)=f(y)$, and the product $f(x)\,f(y)$ is the density of a pair $(U,V)$ of i.i.d. standard normal variables. But $X$ and $Y$ are certainly not independent.
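A quick numerical sketch of this counterexample (using NumPy; the sample size and seed are arbitrary choices): the marginals of $X$ and $Y=X$ are each standard normal, so the product of the marginal densities is a legitimate bivariate normal density, yet the pair is perfectly correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = x  # Y = X: same marginal distribution as X, but fully dependent

# Each marginal looks standard normal (mean ~ 0, std ~ 1)...
print(round(x.mean(), 2), round(x.std(), 2))

# ...yet corr(X, Y) = 1, so X and Y are not independent.
print(np.corrcoef(x, y)[0, 1])
```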

Answer 2:

Note that direct calculations give:

$$ {\rm cov}(Y,W) = 0 $$

$$ {\rm cov}(Y,X) = \sigma_{X}(1-\rho_{X,W}^2) $$

So, since their covariance is nonzero, $X$ and $Y$ cannot be independent unless $\rho_{X,W} = \pm 1$.
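Both covariance identities are easy to check by Monte Carlo (a sketch with hypothetical parameter values $\sigma_X=2$, $\sigma_W=3$, $\rho_{X,W}=0.6$ and zero means):

```python
import numpy as np

# Hypothetical parameters for the bivariate normal pair (X, W).
sigma_x, sigma_w, rho = 2.0, 3.0, 0.6
cov = [[sigma_x**2,           rho * sigma_x * sigma_w],
       [rho * sigma_x * sigma_w, sigma_w**2          ]]

rng = np.random.default_rng(1)
x, w = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000).T
y = x / sigma_x - rho * w / sigma_w

cov_yw = np.cov(y, w)[0, 1]  # should be close to 0
cov_yx = np.cov(y, x)[0, 1]  # should be close to sigma_x * (1 - rho**2)
print(cov_yw, cov_yx, sigma_x * (1 - rho**2))
```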

One general way to go after the joint pdf $f_{X,Y}$ is to compute the joint characteristic function:

$$\phi_{X,Y} (t,u)= \mathbf{E}\left[\exp\left( itX + iuY \right)\right] $$

$$ = \mathbf{E}\left[\exp\left( i(t +u\sigma_X^{-1})X - i (u\rho_{X,W}\sigma_W^{-1} )W \right)\right] $$

$$ = \phi_{X,W} \left(t +u\sigma_X^{-1}, -u\rho_{X,W}\sigma_W^{-1}\right) $$

Note that $\phi_{X,W}$ is explicit for joint normal pairs like $(X,W)$.
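For reference, assuming zero means (nonzero means only contribute an extra linear phase factor), the joint characteristic function of the bivariate normal pair $(X,W)$ is

$$ \phi_{X,W}(t,u) = \exp\!\left(-\tfrac{1}{2}\left(\sigma_X^2 t^2 + 2\rho_{X,W}\,\sigma_X \sigma_W\, t u + \sigma_W^2 u^2\right)\right). $$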

Using the inversion formula for the Fourier transform, one can then recover the joint pdf:

$$ f_{X,Y} (x,y)= (4\pi^2)^{-1} \iint_{\mathbb{R}^2} \phi_{X,Y} (t,u) \exp (-it x -i uy ) dt du. $$

I'm not sure whether the last integral has a closed form.
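One hedged observation, assuming zero means and $|\rho_{X,W}| < 1$: the inversion integral can be bypassed entirely, because $Y$ is a linear combination of the jointly normal pair $(X,W)$, so $(X,Y)$ is itself bivariate normal. Its density is then determined by the second moments computed above (with $\operatorname{var}(Y) = 1 - \rho_{X,W}^2$):

$$ f_{X,Y}(x,y) = \frac{1}{2\pi\sqrt{\det\Sigma}} \exp\!\left(-\tfrac{1}{2}\,(x,y)\,\Sigma^{-1}(x,y)^{\top}\right), \qquad \Sigma = \begin{pmatrix} \sigma_X^2 & \sigma_X(1-\rho_{X,W}^2) \\ \sigma_X(1-\rho_{X,W}^2) & 1-\rho_{X,W}^2 \end{pmatrix}. $$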