Implications of independence among random variables


Consider the real-valued random variables $X,Y,Z,W,A$.

Suppose that $Z,W,A$ are mutually independent and that $Y:=g(Z,W)$ for some Borel measurable function $g:\mathbb{R}^2\rightarrow \mathbb{R}$.

Assume that $X$ has the same distribution as $Y$.

Does this imply that

(1) The distribution of $X$ conditional on $W$ is the same as the distribution of $g(Z,W)$ conditional on $W$? Could you explain why or why not, and whether the independence of $Z$ from $W$ is used?

(2) The distribution of $X$ conditional on $(W,A)$ is the same as the distribution of $g(Z,W)$ conditional on $(W,A)$? Could you explain why or why not, and whether the independence of $A$ from $(Z,W)$ is used?

Best answer:

Assertions (1) and (2) need not be true given your assumptions. The reason is that knowing only that $X$ has the same distribution as $Y$ says nothing about the joint distribution of $X$ with $(Z,W,A)$.

An example where (1) and (2) are false: let $Z$, $W$, $A$ be mutually independent fair coin tosses, each taking value $0$ or $1$ with probability $\frac12$. Let $Y:= Z+W$, and let $X$ have the same distribution as $Y$ (so $X$ has the Binomial($n=2,p=1/2$) distribution), but be drawn independently of $(Z,W,A)$. Then $$P(X=0\mid W=1) = P(X=0)=\textstyle\frac14$$ while $$P(Y=0\mid W=1)=P(Z+W=0\mid W=1)=0.$$ Similarly $P(X=0\mid W=1, A=1)=\frac14$ while $P(Y=0\mid W=1, A=1)=0$. Neither the independence of $Z$ from $W$ nor the independence of $A$ from $(Z,W)$ comes into play.
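The counterexample is easy to check by simulation. The sketch below (assuming the fair-coin setup above, with an arbitrary seed) estimates $P(X=0\mid W=1)$ and $P(Y=0\mid W=1)$ by Monte Carlo: the first should be near $\frac14$, the second exactly $0$.

```python
import random

random.seed(0)
N = 100_000

count_w1 = 0  # number of samples with W = 1
x0_w1 = 0     # of those, how often X = 0
y0_w1 = 0     # of those, how often Y = 0

for _ in range(N):
    Z = random.randint(0, 1)
    W = random.randint(0, 1)
    A = random.randint(0, 1)  # plays no role in the comparison
    Y = Z + W
    # X has the same Binomial(2, 1/2) distribution as Y,
    # but is drawn independently of (Z, W, A)
    X = random.randint(0, 1) + random.randint(0, 1)
    if W == 1:
        count_w1 += 1
        x0_w1 += (X == 0)
        y0_w1 += (Y == 0)

p_x = x0_w1 / count_w1  # estimate of P(X=0 | W=1), near 1/4
p_y = y0_w1 / count_w1  # estimate of P(Y=0 | W=1), exactly 0
print(p_x, p_y)
```

Conditioning on $A=1$ as well (as in part (2)) changes nothing, since $A$ is independent of everything else in play.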