Let $X\sim N(0,1)$ and $Y=(-1)^J \cdot X$, where $P(J=1)=\frac{1}{2}=P(J=0)$. Furthermore, $X$ and $J$ are independent.
I already showed that $Y\sim N(0,1)$ by calculating the CDF of $Y$. Now I want to show that $X$ and $Y$ are dependent and uncorrelated.
I tried showing the dependence by giving a counterexample for independence, but I'm not sure about it:
$$P(X\le-1,\,Y\le-1)=P(X\le-1,\,J=0)=P(X\le-1)\,P(J=0)=\tfrac{1}{2}P(X\le-1),$$
where the first equality holds because on $\{J=1\}$ we have $Y=-X$, so $Y\le-1$ would force $X\ge1$, which is incompatible with $X\le-1$. On the other hand,
$$P(X\le-1)\,P(Y\le-1)=\big(P(X\le-1)\big)^2.$$
Since $P(X\le-1)=\Phi(-1)\neq\tfrac{1}{2}$, these two values differ, so $X$ and $Y$ are not independent.
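As a quick numeric sanity check of the counterexample (this is just an illustration, not part of the proof), the two probabilities can be evaluated via the standard normal CDF, written here with Python's `math.erf`:

```python
import math

def phi(x):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Joint probability: {X <= -1, Y <= -1} can only occur when J = 0
# (if J = 1 then Y = -X, and X <= -1 together with -X <= -1 is impossible),
# so P(X <= -1, Y <= -1) = P(J = 0) * P(X <= -1).
joint = 0.5 * phi(-1.0)

# Product of the marginals: P(X <= -1) * P(Y <= -1) = phi(-1)^2,
# since Y is also N(0, 1).
product = phi(-1.0) ** 2

print(joint, product)  # roughly 0.0793 vs 0.0252 -- clearly not equal
```

The gap between the two numbers confirms that the factorization required for independence fails at this point.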
For the Correlation, I first wanted to calculate the covariance:
$$\mathrm{Cov}(X,Y)=E(XY)-E(X)E(Y)=E(XY),$$ but at this point I'm stuck: I don't know how to compute $E(XY)$. We haven't proven the law of total expectation yet, so I'm not allowed to use it here, and we also haven't covered the $\chi^2$-distribution.
So to sum up, I have two questions: 1.) Is my approach for showing that $X$ and $Y$ are not independent correct? 2.) How can I calculate $E(XY)$?
Thanks in advance!
Yes! Your proof of dependence is correct.
To compute the covariance, you could try the following: $$\mathbb{E}[XY] = \mathbb{E}[(-1)^{J}X^2] = \mathbb{E}[(-1)^{J}]\mathbb{E}[X^2],$$ where the last equality follows by independence of $J$ and $X$ (and hence of $(-1)^J$ and $X^2$).
Note that to compute each expectation you may use the fact that for any random variable $X$ and nice enough function $g$: $$\mathbb{E}[g(X)] = \begin{cases} \sum_{x\in \operatorname{Rg}(X)} g(x) p_X(x) &\text{ if }X\text{ is discrete, and}\\ \int_{\mathbb{R}} g(x) f_X(x)\,dx &\text{ if }X\text{ is (absolutely) continuous}\\ \end{cases}$$ to finish the proof. Both $x\mapsto x^2$ (because it is Borel-measurable) and $a\mapsto (-1)^{a}$ are nice enough for this result.
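For instance, applying that formula to each factor separately (a sketch of the finishing step), the discrete case gives
$$\mathbb{E}\big[(-1)^{J}\big] = (-1)^{0}\cdot\tfrac{1}{2} + (-1)^{1}\cdot\tfrac{1}{2} = 0,$$
and the continuous case gives
$$\mathbb{E}[X^2] = \int_{\mathbb{R}} x^2\,\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\,dx = 1,$$
the second moment of a standard normal. Hence $\mathbb{E}[XY] = 0\cdot 1 = 0$ and $\mathrm{Cov}(X,Y)=0$, so $X$ and $Y$ are uncorrelated despite being dependent.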