Independence of Random Variables and Almost Sure Equivalence Between 3 Random Variables


Let $X, Y, Z:\Omega\longrightarrow \mathbb{R}$ be random variables on a probability space $(\Omega,\mathscr{F},\mathbb{P})$ such that $X$ is independent of $Y$ and $Y = Z$ almost surely/everywhere.

Is it true that $X$ is then also independent of $Z$? And if not, are there further conditions/assumptions under which this would hold?

I have been trying to prove this for a while, since I couldn't come up with a simple counterexample (I must admit I am not great at constructing them).

If true, a proof using the $\sigma$-algebra definition of independence of random variables would be ideal, as I find that definition easier to work with, although any other argument would still be great. Many thanks in advance.
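To make the target concrete, here is a sketch (my own attempt, so please check it) of the event-level identity that would need to hold for all Borel sets $A, B \subseteq \mathbb{R}$:

$$
\mathbb{P}(X \in A,\ Z \in B)
\;\overset{?}{=}\;
\mathbb{P}(X \in A,\ Y \in B)
\;=\;
\mathbb{P}(X \in A)\,\mathbb{P}(Y \in B)
\;\overset{?}{=}\;
\mathbb{P}(X \in A)\,\mathbb{P}(Z \in B),
$$

where the middle equality is the assumed independence of $X$ and $Y$, and the two questioned equalities would presumably follow from $Y = Z$ a.s., since

$$
\mathbb{P}\big(\{Y \in B\} \,\triangle\, \{Z \in B\}\big) \;\le\; \mathbb{P}(Y \neq Z) \;=\; 0,
$$

i.e. the events $\{Y \in B\}$ and $\{Z \in B\}$ differ only on a null set. Is this the right way to argue it?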