What is an example of two probability measures where X and Y are independent with respect to one and not the other?


Having some trouble conceptualizing this. Suppose we have an underlying set $\Omega$, a sigma algebra $\mathcal{F}$ of subsets of $\Omega$, two random variables $X$ and $Y$ on $\Omega$, and two probability measures $P_1$ and $P_2$ such that $X$ and $Y$ are independent with respect to $P_1$ but not with respect to $P_2$. What is an example of this? I'm honestly having trouble understanding the idea of being independent with respect to one probability measure but not the other.

2 Answers


Let $X$ and $Y$ be independent random variables, each with, say, a Poisson(1) distribution, let $A=\{X=Y\}$, and define $Q(E)=\frac {P(A\cap E)} {P(A)}$. Then $X=Y$ almost surely w.r.t. $Q$, so they are not independent w.r.t. $Q$. [It should be noted that $X$ and $Y$ are not constant w.r.t. $Q$. In fact, they take every non-negative integer value with positive probability.]
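A minimal numerical sketch of this construction (the truncation level `N` is an arbitrary choice; the Poisson(1) tail beyond it is negligible): we compute $Q(X=i, Y=j)$ explicitly and check that the joint law does not factor into its marginals.

```python
import math

def pois(k, lam=1.0):
    # Poisson(lam) pmf
    return math.exp(-lam) * lam**k / math.factorial(k)

N = 50  # truncation level; Poisson(1) mass beyond 50 is negligible

# P(A) = P(X = Y) = sum_k P(X=k) P(Y=k), since X, Y are independent under P
p_A = sum(pois(k) ** 2 for k in range(N))

def Q(i, j):
    # Q(X=i, Y=j) = P(X=i, Y=j, X=Y) / P(A); zero off the diagonal
    return (pois(i) * pois(j) if i == j else 0.0) / p_A

def Q_marginal(i):
    # marginal of X under Q (equals the marginal of Y by symmetry)
    return sum(Q(i, j) for j in range(N))

# Each of X and Y takes the values 0 and 1 with positive probability under Q...
assert Q_marginal(0) > 0 and Q_marginal(1) > 0
# ...yet Q(X=0, Y=1) = 0 != Q_marginal(0) * Q_marginal(1),
# so X and Y are not independent under Q.
assert Q(0, 1) == 0.0
```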


Suppose $X$ and $Y$ are discrete random variables taking values in finite spaces $\mathcal{X}$ and $\mathcal{Y}$ respectively. Then you can consider $\Omega = \mathcal{X} \times \mathcal{Y}$ with the discrete $\sigma$-algebra. Specifying a probability measure on $\Omega$ is essentially filling in a table of probabilities, i.e. specifying a probability for each pair $(x,y) \in \Omega$, and you can write down explicitly what independence looks like: each cell must equal the product of its row and column sums. You can pick the cell probabilities to satisfy this factorization, or you can pick them to violate it.
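As a concrete sketch of the table idea, take $\mathcal{X} = \mathcal{Y} = \{0,1\}$, so $\Omega$ has four points. The specific numbers below are an arbitrary choice: `P1` is a uniform table (independent by construction), while `P2` puts extra mass on the diagonal (dependent).

```python
from itertools import product

X_vals = [0, 1]
Y_vals = [0, 1]

# P1: uniform table = product of uniform marginals, so X and Y are independent
P1 = {(x, y): 0.25 for x, y in product(X_vals, Y_vals)}

# P2: same Omega, same sigma-algebra, but mass skewed toward the diagonal
P2 = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def is_independent(P):
    # Check P(X=x, Y=y) == P(X=x) * P(Y=y) for every cell of the table,
    # where the marginals are the row and column sums.
    px = {x: sum(P[(x, y)] for y in Y_vals) for x in X_vals}
    py = {y: sum(P[(x, y)] for x in X_vals) for y in Y_vals}
    return all(abs(P[(x, y)] - px[x] * py[y]) < 1e-12
               for x, y in product(X_vals, Y_vals))

assert is_independent(P1)        # independent under P1
assert not is_independent(P2)    # dependent under P2: 0.4 != 0.5 * 0.5
```

The same two random variables $X(x,y) = x$ and $Y(x,y) = y$ live on the same measurable space throughout; only the measure changes, which is exactly the situation the question asks about.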