I am reading about the relation between correlation and independence on Wikipedia: Link
On the page linked above, two steps in the following example stumped me.
Definition of random variables $X, W, Y$:
\begin{align}X&\sim N(0,1)\\P(W=-1)&= P(W=1)=0.5\\Y&=WX\end{align}
$ \begin{align} \mbox{cov}(X,Y)&=E(XY)-E(X)E(Y)=E(XY)=E(E(XY|W))\\ &=E(X^2)\mbox{Pr}(W=1)+E(-X^2)\mbox{Pr}(W=-1)\\ &=1\cdot\frac12 + (-1)\cdot \frac12 = 0 \end{align} $
1. $E(E(XY|W)) \longrightarrow E(X^2)\mbox{Pr}(W=1)+E(-X^2)\mbox{Pr}(W=-1)$ ?
$\begin{align} \mbox{Pr}(Y\le x)&=E(\mbox{Pr}(Y\le x|W))\\ &=\mbox{Pr}(X\le x)\mbox{Pr}(W=1) + \mbox{Pr}(-X \le x)\mbox{Pr}(W = -1)\\ &=\Phi(x)\frac12 + \Phi(x)\frac12 = \Phi(x) \end{align}$
where $\Phi(x)$ is the c.d.f. of the standard normal distribution.
2. $E(\mbox{Pr}(Y\le x|W)) \longrightarrow \mbox{Pr}(X\le x)\mbox{Pr}(W=1) + \mbox{Pr}(-X \le x)\mbox{Pr}(W = -1)$ ?
Can someone explain steps 1 and 2? I would like to see the intermediate steps that were omitted.
For (1.) : In general, $E(XY|W)$ is a random variable that depends on the conditioning variable. That is, $E(XY|W)=g(W)$.
Also, in general, for a discrete random variable $A$ you know that $E(A)=\sum_a a \, P(A=a)$ and $E(g(A))=\sum_a g(a) \, P(A=a)$.
Then $E[E(XY|W)]=E[g(W)]=\sum g(w) P(W=w)$. But in our case $W$ can take two values, hence $$E[E(XY|W)]= g(1) P(W=1)+g(-1) P(W=-1) $$
Concretely, $g(1)=E(X Y|W=1)= E(X\, X\, W | W=1) =E(X^2)$, and similarly $g(-1)=E(X Y|W=-1)=E(-X^2)$, which gives exactly the expression in (1.).
For (2.) : It's the same thing, now with $g(W)=\mbox{Pr}(Y\le x|W)$, so $g(1)=\mbox{Pr}(X\le x)$ and $g(-1)=\mbox{Pr}(-X \le x)$.
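You can also check both facts numerically. A minimal simulation sketch in Python (NumPy assumed available; the seed and sample size are arbitrary choices):

```python
import math

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.standard_normal(n)              # X ~ N(0, 1)
w = rng.choice([-1.0, 1.0], size=n)     # P(W = -1) = P(W = 1) = 0.5
y = w * x                               # Y = W X


def phi(t):
    """Standard normal c.d.f., computed via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))


# Sample covariance of X and Y: should be near 0
print(np.cov(x, y)[0, 1])

# Y is standard normal: compare the empirical c.d.f. of Y with Phi
for t in (-1.0, 0.0, 1.0):
    print(t, (y <= t).mean(), phi(t))

# Yet X and Y are dependent: |Y| = |X| holds exactly
print(np.allclose(np.abs(x), np.abs(y)))
```

So the sample covariance vanishes and the empirical c.d.f. of $Y$ matches $\Phi$, even though knowing $X$ pins down $Y$ up to sign.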