The correlation between two random variables


Let $X_{1}$, $X_{2}$, and $B$ be independent random variables with $P(B=-1)=P(B=1)=\frac{1}{2}$. Let $Y_{1}=B\left\vert X_{1}\right\vert$ and $Y_{2}=B\left\vert X_{2} \right\vert$, and suppose $Y_{1} \sim N(0,1)$ and $Y_{2} \sim N(0,1)$. Show that $\rho(Y_{1},Y_{2})=\frac{2}{\pi}$.

The first thing that I thought of was using the correlation formula. So,

$\begin{align} \rho(Y_{1},Y_{2})&= \frac{\operatorname{Cov}(Y_{1},Y_{2})}{\sqrt{\operatorname{Var}(Y_{1})\operatorname{Var}(Y_{2})}}\\ &= \frac{E(Y_{1}Y_{2})-E(Y_{1})E(Y_{2})}{\sqrt{\left(E(Y_{1}^{2})-(E(Y_{1}))^{2}\right)\left(E(Y_{2}^{2})-(E(Y_{2}))^{2}\right)}} \end{align} $

However, I am not sure how I would obtain all these components. I was wondering if someone could help me get started. Thanks in advance!
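Before working through the algebra, the claimed value can be checked by simulation. The sketch below assumes $X_1, X_2 \sim N(0,1)$ (which makes $Y_1, Y_2 \sim N(0,1)$ as required); the sample correlation should land near $2/\pi \approx 0.6366$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Assumption: X1, X2 ~ N(0,1), so that Y1, Y2 ~ N(0,1) as the problem requires.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
b = rng.choice([-1.0, 1.0], size=n)  # P(B=-1) = P(B=1) = 1/2

y1 = b * np.abs(x1)
y2 = b * np.abs(x2)

rho = np.corrcoef(y1, y2)[0, 1]
print(rho, 2 / np.pi)  # sample correlation vs. the claimed 2/pi
```

This is only a sanity check, of course, not a proof.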

Update:

Using the law of total expectation,

$\begin{align} E\left( Y_{1}\right)&= E\left( Y_{1} \mid B=1 \right)\cdot P\left(B=1\right)+E\left( Y_{1} \mid B=-1 \right)\cdot P\left( B=-1\right) \\ &=E\left( Y_{1} \mid B=1 \right)\cdot \frac{1}{2}+E\left( Y_{1} \mid B=-1 \right)\cdot \frac{1}{2}\\ &= ? \end{align} $

Where would I go from here?
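One way forward: since $X_1$ is independent of $B$, $E(Y_1 \mid B=1)=E|X_1|$ and $E(Y_1 \mid B=-1)=-E|X_1|$, so the two terms cancel and $E(Y_1)=0$. A quick numeric check of this cancellation (again assuming $X_1 \sim N(0,1)$, where $E|X_1|=\sqrt{2/\pi}$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Assumption: X1 ~ N(0,1). By independence of X1 and B,
# E(Y1 | B=1) = E|X1| and E(Y1 | B=-1) = -E|X1|.
x1 = rng.standard_normal(n)
e_abs = np.abs(x1).mean()  # estimates E|X1| = sqrt(2/pi) ≈ 0.798

# Law of total expectation: the conditional terms cancel exactly.
e_y1 = 0.5 * e_abs + 0.5 * (-e_abs)
print(e_abs, e_y1)  # e_y1 = 0, i.e. E(Y1) = 0
```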

On BEST ANSWER

Basically, $Y_1$ and $Y_2$ are each normally distributed, but they always have the same sign. To show that the $Y_i$ are distributed $N(0,1)$ (taking $X_1, X_2 \sim N(0,1)$), we can show that $Y_1$ has the same cumulative distribution function as $X_1$. Let $C_Y$ be the cumulative distribution function of $Y_1$ and let $C_X$ be the cumulative distribution function of $X_1$.

First assume $y<0$. Then $$C_Y(y)= P(Y<y) = P( B=-1\mathrm{\ and\ } |X_1|>|y|)$$ $$ = P( B=-1) \cdot P( |X_1|>|y|)$$ $$ =\frac12 \cdot (P( X_1>|y| ) + P( X_1< y ))$$ $$ =\frac12 \cdot 2 P( X_1< y )$$ $$ = P( X_1< y )$$ $$ = C_X(y).$$

If $y>0$, then $$C_Y(y)= 1- P(Y>y) = 1- P( B=1\mathrm{\ and\ } |X_1|>y)$$ $$ =1- P( B=1) \cdot P( |X_1|>y)$$ $$ =1- \frac12 \cdot (P( X_1>y ) + P( X_1< -y ))$$ $$ =1 -\frac12 \cdot 2 P( X_1< -y )$$ $$ = 1-P( X_1< -y )$$ $$ = 1-C_X(-y)$$ $$ = C_X(y).$$ For each equality above you should supply a reason.

We have shown that the cumulative distribution function of $Y_1$ is the same as that of $X_1$, which is distributed $N(0,1)$. Very similar reasoning shows that $Y_2$ is distributed $N(0,1)$.
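The CDF argument above can be checked empirically: under the same assumption $X_1 \sim N(0,1)$, the empirical CDF of simulated $Y_1 = B|X_1|$ values should stay uniformly close to the $N(0,1)$ CDF. A Kolmogorov-Smirnov-style distance makes this concrete:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)
n = 200_000

# Assumption: X1 ~ N(0,1); the claim is that Y1 = B|X1| ~ N(0,1) too.
x1 = rng.standard_normal(n)
b = rng.choice([-1.0, 1.0], size=n)
y1 = np.sort(b * np.abs(x1))

def norm_cdf(t):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

# Max gap between the empirical CDF of Y1 and the N(0,1) CDF;
# it should shrink toward 0 as n grows if the two distributions agree.
ecdf = np.arange(1, n + 1) / n
ks = np.max(np.abs(ecdf - np.array([norm_cdf(t) for t in y1])))
print(ks)
```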

It remains to compute the covariance. Since the $Y_i$ are $N(0,1)$, their means are $0$ and their standard deviations are $1$, so $\rho(Y_1,Y_2)=\operatorname{Cov}(Y_1,Y_2)=E(Y_1Y_2)$.
$$Cov(Y_1,Y_2) = \int_{y_1=0}^\infty\int_{y_2=0}^\infty y_1 y_2 \;2 f(y_1)\; f(y_2)\; dy_1 dy_2$$ $$+ \int_{y_1=-\infty}^0\int_{y_2=-\infty}^0 y_1 y_2 \;2 f(y_1)\; f(y_2)\; dy_1 dy_2$$ $$= 4\int_{y_1=0}^\infty\int_{y_2=0}^\infty y_1 y_2 \;f(y_1)\; f(y_2)\; dy_1 dy_2$$ $$ = 4\int_{y_1=0}^\infty\int_{y_2=0}^\infty \frac{ y_1 y_2}{2 \pi} \exp(-y_1^2/2) \exp(-y_2^2/2)\; dy_1 dy_2 $$ $$= 4\int_{r=0}^\infty\int_{\theta=0}^{\pi/2} \frac{ r^2 \sin(\theta) \cos(\theta)}{2 \pi} \exp(-r^2/2)\; r\; d\theta dr $$ $$=\int_{r=0}^\infty\int_{\theta=0}^{\pi/2} \frac{ r^2 \sin(2 \theta)}{ \pi} \exp(-r^2/2)\; r\; d\theta dr $$ $$=\int_{r=0}^\infty r^3\exp(-r^2/2)\; dr \cdot \int_{\theta=0}^{\pi/2} \frac{ \sin(2 \theta)}{ \pi}d\theta $$ $$= 2\cdot \frac{ 1}{ \pi}= 2/\pi $$ where $f(x)$ is the pdf of the $N(0,1)$ distribution. On the first line, the factor of 2 in front of $f(y_1) f(y_2)$ appears because $Y_1$ and $Y_2$ always share the sign of $B$: the joint density is $2f(y_1)f(y_2)$ on quadrants I and III and zero on quadrants II and IV. To compute $\int_0^\infty r^3 \exp(-r^2/2)\; dr = 2$, use integration by parts (or substitute $u = r^2/2$).
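The two one-dimensional factors from the polar-coordinate step can be checked numerically (using SciPy, purely as a sanity check on the final line):

```python
import numpy as np
from scipy.integrate import quad

# Radial factor: integral of r^3 * exp(-r^2/2) over [0, inf); should be 2.
radial, _ = quad(lambda r: r**3 * np.exp(-r**2 / 2), 0, np.inf)

# Angular factor: integral of sin(2*theta)/pi over [0, pi/2]; should be 1/pi.
angular, _ = quad(lambda t: np.sin(2 * t) / np.pi, 0, np.pi / 2)

cov = radial * angular
print(radial, angular, cov, 2 / np.pi)  # cov should match 2/pi
```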