How can I prove that $YZ$ and $Y$ are not independent random variables?


Let $Y\in L^1(\Bbb{P})$. Pick $Z$ such that $\Bbb{P}(Z=\pm 1)=1/2$ and such that $Z$ is independent of $Y$. Now define $X:=YZ$. I want to prove that $X$ and $Y$ are not independent.

My idea was the following:

Proof $$\begin{align} \Bbb{P}(X=1,Y=1)&=\Bbb{P}(YZ=1,Y=1)\\&= \Bbb{P}(Z=1)\\&=1/2\end{align}$$ But $$\begin{align} \Bbb{P}(X=1)\Bbb{P}(Y=1)&=\Bbb{P}(YZ=1)\Bbb{P}(Y=1)\\&=\Bbb{P}(Y=1)^2\Bbb{P}(Z=1)\\&=1/2\cdot\Bbb{P}(Y=1)^2\neq 1/2\end{align}$$ where in the second equality I use that $Y$ is independent from $Z$ to write $\Bbb{P}(YZ=1)=\Bbb{P}(Y=1)\Bbb{P}(Z=1)$.

Does this work or not?

Answer 1:

The second equality in the proposed proof is wrong: $$ \Bbb{P}(YZ=1,Y=1) \ne \Bbb{P}(Z=1), $$ unless $Y$ is the constant $1$.

What you are trying to prove is false without further assumptions. Indeed, if $Y$ has the same law as $Z$, then $X$ and $Y$ are independent.

To get the desired conclusion, add the hypothesis that $|Y|$ is not almost surely constant (note that $|X|=|Y|$, so a non-constant $|Y|$ already forces dependence).
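Both phenomena can be seen numerically. Here is a quick Monte Carlo sketch (my own illustration, not part of the answer); the two laws chosen for $Y$ are hypothetical examples, one matching the law of $Z$ and one with non-constant $|Y|$:

```python
import random

random.seed(0)
n = 200_000
# Z is a fair +/-1 coin, drawn independently of everything below.
Z = [random.choice([-1, 1]) for _ in range(n)]

# Case 1: Y has the same law as Z (fair +/-1), independent of Z.
# Then X = YZ and Y should look independent.
Y = [random.choice([-1, 1]) for _ in range(n)]
X = [y * z for y, z in zip(Y, Z)]
joint = sum(x == 1 and y == 1 for x, y in zip(X, Y)) / n
prod = (sum(x == 1 for x in X) / n) * (sum(y == 1 for y in Y) / n)
print(abs(joint - prod) < 0.01)   # True: no detectable dependence

# Case 2: Y uniform on {1, 2}, so |Y| is not constant. Since |X| = |Y|,
# the event {X = 1, Y = 2} is impossible while P(X=1)P(Y=2) > 0.
Y2 = [random.choice([1, 2]) for _ in range(n)]
X2 = [y * z for y, z in zip(Y2, Z)]
joint2 = sum(x == 1 and y == 2 for x, y in zip(X2, Y2)) / n
prod2 = (sum(x == 1 for x in X2) / n) * (sum(y == 2 for y in Y2) / n)
print(joint2 == 0.0 and prod2 > 0.1)   # True: joint and product disagree
```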

Answer 2:

In my comment I didn't state one other condition, so I'll write a full answer to make it explicit. Yuval Peres rightfully pointed out that the second equality in the proposed proof was not correct. Let us write it out again:

\begin{align*} P(X = 1, Y = 1) & = P(Y \cdot Z = 1, Y = 1) \\[3pt] & = P(Z = 1, Y = 1) \\[3pt] & = P(Z = 1) \cdot P(Y = 1) = 1/2 \cdot P(Y = 1) \end{align*}

where the second equality holds by rewriting the event inside the probability (on $\{Y = 1\}$, $YZ = 1$ exactly when $Z = 1$) and the third because $Z$ is independent of $Y$. Your second derivation is also incorrect. Notice that

\begin{align*} P(X = 1) \cdot P(Y = 1) & = P(Y \cdot Z = 1) \cdot P(Y = 1) \\[3pt] & = P((Y = 1, Z = 1) \vee (Y = - 1, Z = - 1)) \cdot P(Y = 1) \\[3pt] & = (P(Y = 1, Z = 1) + P(Y = - 1, Z = - 1)) \cdot P(Y = 1) \\[3pt] & = (P(Y = 1) \cdot P(Z = 1) + P(Y = - 1) \cdot P(Z = - 1)) \cdot P(Y = 1), \end{align*}

where the second equality is the crucial change. Notice that if $YZ = 1$ then, since $Z = \pm 1$, either ($Y = 1$ and $Z = 1$) or ($Y = -1$ and $Z = -1$). The third and fourth equalities use finite additivity and independence, respectively. Now to the assumptions. Assume that

  • $0 < P(Y = 1) < 1$;

  • $1 \neq P(Y = 1) + P(Y = - 1)$.
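Under assumptions like these, the decomposition above can be checked exactly. Here is a sketch with a hypothetical three-point law for $Y$ (mass $1/3$ on each of $-1$, $1$, $2$, so $0 < P(Y=1) < 1$ and $P(Y=1)+P(Y=-1) = 2/3 \neq 1$):

```python
from fractions import Fraction as F
from itertools import product

# Hypothetical law for Y satisfying both assumptions; Z is a fair +/-1 coin.
pY = {1: F(1, 3), -1: F(1, 3), 2: F(1, 3)}
pZ = {1: F(1, 2), -1: F(1, 2)}

# Exact joint law of (Y, Z) under the assumed independence.
joint = {(y, z): pY[y] * pZ[z] for y, z in product(pY, pZ)}

# P(X=1, Y=1) = P(Z=1, Y=1) = (1/2) P(Y=1)
p_x1_y1 = sum(p for (y, z), p in joint.items() if y * z == 1 and y == 1)
assert p_x1_y1 == F(1, 2) * pY[1]

# P(X=1) = P(Y=1)P(Z=1) + P(Y=-1)P(Z=-1), as in the display above.
p_x1 = sum(p for (y, z), p in joint.items() if y * z == 1)
assert p_x1 == pY[1] * pZ[1] + pY[-1] * pZ[-1]

print(p_x1_y1, p_x1 * pY[1])   # 1/6 vs 1/9: the two sides differ
```

The printed values $1/6 \neq 1/9$ are exactly the failure of independence claimed below.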

Then

\begin{equation} P(X = 1, Y = 1) \neq P(X = 1) \cdot P(Y = 1), \end{equation}

otherwise, if equality held, we could divide both sides by $\tfrac{1}{2}P(Y = 1) > 0$ (positive by the first assumption) to get $P(Y = 1) + P(Y = - 1) = 1$, contradicting the second assumption. This proves that $X$ and $Y$ are not independent.

If you are unsure about this contradiction, try equating $P(X = 1, Y = 1)$ to $P(X = 1) \cdot P(Y = 1)$ and, under the assumptions that I gave (and also $P(Z = \pm 1) = 1/2$), try eliminating as many terms as possible.
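That elimination can also be sketched mechanically; in the snippet below, `p` and `q` are placeholder names for $P(Y=1)$ and $P(Y=-1)$, scanned over a small grid of exact values:

```python
from fractions import Fraction as F

# The tentative equality P(X=1, Y=1) = P(X=1) P(Y=1) reads
#   (1/2) p = (1/2)(p + q) p,   with p = P(Y=1), q = P(Y=-1).
# Dividing by (1/2) p > 0 leaves p + q = 1, so the equality must
# fail whenever p + q != 1. Brute-force check over a grid:
half = F(1, 2)
for p in (F(k, 10) for k in range(1, 10)):       # 0 < p < 1
    for q in (F(k, 10) for k in range(0, 11)):
        if p + q <= 1 and p + q != 1:            # valid sub-probability mass
            assert half * p != half * (p + q) * p
print("equality fails whenever p + q != 1")
```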

Maybe there is a weaker set of conditions that guarantees that $X$ and $Y$ are not independent, but I'm unsure of it right now. I'll think about it and, if I figure something out, I'll post it here.