Independence of two variables


I ran into some problems while doing an exercise. The problem goes as follows:

Suppose we have two independent random variables $X$ and $Y$, both normally distributed with parameters $(0, \sigma^2)$, i.e. $\mathbb{P}(dx)=\frac{1}{\sigma\sqrt{2\pi}} \exp\left(-\frac{x^2}{2\sigma^2}\right)dx$. For $\gamma \in \mathbb{R}$, we set $U = X \cos\gamma - Y\sin\gamma$ and $V = X \sin\gamma + Y\cos\gamma$. Show that $U$ and $V$ are independent and compute their distributions.

What I've tried:

I know that to check independence I need to show $$\mathbb{E}(\varphi (U) \psi (V) )= \mathbb{E}(\varphi(U)) \cdot \mathbb{E}(\psi(V))$$ for all bounded measurable $\varphi, \psi$. For that I need to calculate $\mathbb{P}_U$, $\mathbb{P}_V$ and $\mathbb{P}_{(U,V)}$. There are two ways to do that: the pushforward measure or the density function. I'm stuck at calculating $\mathbb{P}_U$: for the pushforward measure I can't express $X$ and $Y$ using only $U$ or $V$, and for the density function I have trouble with the trigonometric functions, since their signs change with the quadrant and so does the direction of the inequality in $\mathbb{P}(X \cos\gamma - Y\sin\gamma\leq t)$.

Thanks in advance


Accepted answer:

It is straightforward to compute the joint density of $(U,V)$ from that of $(X,Y)$. The standard undergraduate treatment of this topic involves Jacobians and the like (and is often not understood very well by said undergraduates); in this instance the Jacobian approach is easier than usual since the transformation is linear. Even more strongly, for this particular problem the answer can be written down with nary a mention of Jacobians, expectations, and the like. The transformation in question is a rotation of axes, and since the joint density $f_{X,Y}(x,y)$ has circular symmetry about the origin, rotating the axes does not change the function: the joint density $f_{U,V}$ is the same function as $f_{X,Y}$, that is, $$f_{U,V}(u,v) = \frac{1}{2\pi\sigma^2}\exp\left(-\frac{u^2+v^2}{2\sigma^2}\right), \qquad -\infty < u, v < \infty,$$ and the independence of $U$ and $V$ follows immediately: $$f_{U,V}(u,v) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{u^2}{2\sigma^2}\right) \cdot \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{v^2}{2\sigma^2}\right) = f_X(u)f_Y(v).$$
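The rotation-invariance argument is easy to check numerically. Here is a quick Monte Carlo sketch with NumPy; the values of $\sigma$ and $\gamma$ are illustrative, not taken from the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, gamma = 2.0, 0.7   # illustrative parameter values
n = 1_000_000

# Sample (X, Y) i.i.d. N(0, sigma^2)
x = rng.normal(0.0, sigma, n)
y = rng.normal(0.0, sigma, n)

# Rotate each sample point by the angle gamma
u = x * np.cos(gamma) - y * np.sin(gamma)
v = x * np.sin(gamma) + y * np.cos(gamma)

# U and V should again look N(0, sigma^2), with correlation ~ 0
print(u.std(), v.std())          # both close to sigma
print(np.corrcoef(u, v)[0, 1])   # close to 0
```

Any other $\gamma$ gives the same picture, which is exactly the circular-symmetry point made above.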

Second answer:

Showing $\mathbb{E}(UV)=\mathbb{E}(U)\mathbb{E}(V)$ establishes that $U$ and $V$ are uncorrelated, which in general is weaker than independence; but $(U,V)$ is a linear transformation of the Gaussian vector $(X,Y)$, hence jointly Gaussian, and for jointly Gaussian variables zero covariance does imply independence. Note that since $U$ and $V$ are both linear combinations of $X$ and $Y$, even knowing that $X$ and $Y$ are independent does not make the independence of $U$ and $V$ obvious.

Now
$$\mathbb{E}[UV] = \mathbb{E}\big[(X \cos\gamma - Y \sin\gamma)(X \sin\gamma + Y\cos\gamma)\big].$$

Expanding this, we get
$$\mathbb{E}\big[X^2 \sin\gamma \cos\gamma - Y^2 \sin\gamma \cos\gamma + XY \cos^2\gamma - XY \sin^2\gamma\big] = \mathbb{E}\big[(X^2-Y^2) \sin\gamma \cos\gamma + XY(\cos^2\gamma - \sin^2\gamma)\big].$$

Now we recognize the double-angle identities from trigonometry,
$$\cos 2\gamma = \cos^2\gamma - \sin^2\gamma \quad\text{and}\quad \sin 2\gamma = 2 \sin\gamma \cos\gamma.$$

Substituting in, we get
$$\mathbb{E}[UV] = \mathbb{E}\left[(X^2-Y^2)\,\frac{\sin 2\gamma}{2} + XY \cos 2\gamma\right] = \frac{\sin 2\gamma}{2}\big(\mathbb{E}[X^2]-\mathbb{E}[Y^2]\big) + \cos 2\gamma\, \mathbb{E}[XY].$$
Since $X$ and $Y$ are independent, $\mathbb{E}[XY] = \mathbb{E}[X]\mathbb{E}[Y]$, so
$$\mathbb{E}[UV] = \frac{\sin 2\gamma}{2}\big(\mathbb{E}[X^2]-\mathbb{E}[Y^2]\big) + \cos 2\gamma\, \mathbb{E}[X]\, \mathbb{E}[Y]. \tag{1}$$

Now consider
$$\mathbb{E}[U]\,\mathbb{E}[V] = \mathbb{E}[X \cos\gamma - Y \sin\gamma]\,\mathbb{E}[X \sin\gamma + Y \cos\gamma] = \big(\cos\gamma\, \mathbb{E}[X] - \sin\gamma\, \mathbb{E}[Y]\big)\big(\sin\gamma\, \mathbb{E}[X] + \cos\gamma\, \mathbb{E}[Y]\big)$$
$$= \sin\gamma \cos\gamma\, \mathbb{E}[X]^2 + (\cos^2\gamma - \sin^2\gamma)\,\mathbb{E}[X]\, \mathbb{E}[Y] - \sin\gamma \cos\gamma\, \mathbb{E}[Y]^2.$$

Applying the double-angle formulae here, we get
$$\mathbb{E}[U]\,\mathbb{E}[V] = \cos 2\gamma\, \mathbb{E}[X]\, \mathbb{E}[Y] + \frac{\sin 2\gamma}{2}\big(\mathbb{E}[X]^2 - \mathbb{E}[Y]^2\big). \tag{2}$$

In general (1) and (2) differ, but here they coincide because $X, Y \sim N(0, \sigma^2)$: $\mathbb{E}[X] = \mathbb{E}[Y] = 0$ and $\mathbb{E}[X^2] = \mathbb{E}[Y^2] = \sigma^2$, so both expressions vanish. Hence $\mathbb{E}[UV]=\mathbb{E}[U]\,\mathbb{E}[V]$, i.e. $\operatorname{Cov}(U,V)=0$; and since $(U,V)$ is jointly Gaussian, $U$ and $V$ are independent.
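As a sanity check, equations (1) and (2) can be verified symbolically. The sketch below uses SymPy, treating the moments $\mathbb{E}[X]$, $\mathbb{E}[Y]$, $\mathbb{E}[X^2]$, $\mathbb{E}[Y^2]$ as free symbols (the names `EX`, `EY`, `EX2`, `EY2` are ad hoc, introduced only for this check):

```python
import sympy as sp

g, sigma = sp.symbols('gamma sigma')
EX, EY, EX2, EY2 = sp.symbols('EX EY EX2 EY2')  # E[X], E[Y], E[X^2], E[Y^2]

# Equation (1): E[UV] after using independence E[XY] = E[X]E[Y]
eq1 = sp.sin(2*g)/2 * (EX2 - EY2) + sp.cos(2*g) * EX * EY

# Equation (2): E[U]E[V] expanded directly
eq2 = sp.expand((sp.cos(g)*EX - sp.sin(g)*EY) * (sp.sin(g)*EX + sp.cos(g)*EY))

# Specialize to X, Y ~ N(0, sigma^2): zero means, equal second moments
subs = {EX: 0, EY: 0, EX2: sigma**2, EY2: sigma**2}
print(sp.simplify(eq1.subs(subs)))  # 0
print(sp.simplify(eq2.subs(subs)))  # 0
```

Both sides reduce to $0$, confirming $\mathbb{E}[UV] = \mathbb{E}[U]\,\mathbb{E}[V]$ for this distribution.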