Conditionally IID random variables


Several books define what conditional independence means, and some of them go on to use the term "conditionally i.i.d. random variables", but I could not find a precise definition of what it means for random variables to be "conditionally identically distributed".

If $X_1$, $X_2$ are conditionally identically distributed given the $\sigma$-algebra $\mathcal{A}$, does it mean

  1. There is a regular version of $P\left(X_1\in\cdot\mid\mathcal{A}\right)$, $\kappa_1\left(B,\omega\right)$, and a regular version of $P\left(X_2\in\cdot\mid\mathcal{A}\right)$, $\kappa_2\left(B,\omega\right)$, such that for all $\omega$, $\kappa_1\left(\cdot,\omega\right)$ and $\kappa_2\left(\cdot,\omega\right)$ are the same probability measure? or

  2. If $\kappa_1\left(B,\omega\right)$ is any version of $P\left(X_1\in B\mid\mathcal{A}\right)$ (regular or not) and $\kappa_2\left(B,\omega\right)$ is any version of $P\left(X_2\in B\mid\mathcal{A}\right)$ (likewise), then for every $B$, $\kappa_1\left(B,\cdot\right)=\kappa_2\left(B,\cdot\right)$ a.s.?


There are 2 answers below.

BEST ANSWER

I'm inclined to believe the correct interpretation is #2. This is based on the following three considerations:

  1. Interpretation #2 is weaker than #1, so if a proposition involving conditionally identically distributed random objects is proved under definition #2, it also holds under definition #1, but not necessarily vice versa. It is therefore more economical to prove results under interpretation #2.

  2. The definition of conditional independence doesn't require the existence of regular conditional probabilities, and the concept "conditionally identically distributed" seems closely tied to conditional independence: I've only come across the former in the expression "conditionally IID random variables".

  3. The following problem from [Schervish] (Chapter 1, Problem 4), for example, can be solved using definition #2: "Suppose that $\left\{X_n \right\}_{n=1}^\infty$ are conditionally IID given $Y$. Prove that they are exchangeable." This anecdotal evidence suggests that interpretation #2 is adequate and consistent with the way the concept is used in the literature.

Until I come across an example to the contrary, I will therefore assume interpretation #2.
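For what it's worth, here is a sketch of how the exchangeability claim in point 3 goes through under interpretation #2 (only finitely many sets are involved at a time, so no regular versions are needed). Suppose the $X_n$ are conditionally IID given $Y$ with common conditional distribution $\kappa\left(B,\cdot\right)=P\left(X_n\in B\mid Y\right)$ a.s. for every Borel set $B$. Then for any permutation $\pi$ of $\{1,\dots,n\}$ and Borel sets $B_1,\dots,B_n$,

$$P\left(X_{\pi(1)}\in B_1,\dots,X_{\pi(n)}\in B_n\right)=E\left[\prod_{i=1}^{n}P\left(X_{\pi(i)}\in B_i\mid Y\right)\right]=E\left[\prod_{i=1}^{n}\kappa\left(B_i,\cdot\right)\right],$$

where the first equality uses conditional independence and the second uses the common conditional distribution. The right-hand side does not depend on $\pi$, so comparing with $\pi=\mathrm{id}$ gives exchangeability.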

ANSWER

You are handed a coin. With probability $1/2$ it's a fair coin. With probability $1/2$ it's a biased coin that gives you $90\%$ heads and $10\%$ tails. Given that you get "heads" the first six times, what's the conditional probability that you get "heads" the seventh time? It's pretty high, because the conditional probability that you got the biased coin, given that outcome, is high. In other words, obviously the outcomes of the tosses are not independent.
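As a sanity check, the "pretty high" probability can be computed exactly by Bayes' rule; here is a quick sketch in Python using only the numbers from the example:

```python
# Exact Bayes computation for the coin example: probability of heads on
# toss 7 given six heads, under the half fair / half 90%-heads mixture.
p_fair, p_bias = 0.5, 0.9
prior = 0.5  # prior probability of each coin

num = prior * p_fair**7 + prior * p_bias**7   # P(seven heads in a row)
den = prior * p_fair**6 + prior * p_bias**6   # P(six heads in a row)
p7 = num / den                                # P(heads on toss 7 | six heads)
post_biased = prior * p_bias**6 / den         # P(biased coin | six heads)
print(round(p7, 3), round(post_biased, 3))    # 0.889 0.971
```

So after six heads you are about $97\%$ sure you hold the biased coin, and the seventh toss lands heads with probability about $0.889$, not $0.7$ as it would if the tosses were unconditionally independent.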

But the outcomes of the tosses are

  • conditionally i.i.d. given that you got the fair coin; and
  • conditionally i.i.d. given that you got the biased coin.

There's a conditional probability distribution of the sequence of outcomes, given that you got the fair coin. There's also a conditional probability distribution of the sequence of outcomes, given that you got the biased coin. In either of those distributions, you've got a sequence of i.i.d. outcomes.

So the outcomes are conditionally i.i.d. given the kind of coin you got.
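A small Monte Carlo sketch (illustrative simulation code, not part of the original answer) shows both halves of this: unconditionally the tosses are dependent, but conditioning on the coin type decouples them:

```python
import random

random.seed(0)
N = 200_000
pairs = []  # (coin_is_biased, toss1_is_heads, toss2_is_heads)
for _ in range(N):
    biased = random.random() < 0.5           # which coin you were handed
    p = 0.9 if biased else 0.5               # its heads probability
    pairs.append((biased, random.random() < p, random.random() < p))

# Unconditionally: P(toss2 = H | toss1 = H) != P(toss2 = H)
p_h2 = sum(t2 for _, _, t2 in pairs) / N
p_h2_given_h1 = (sum(t1 and t2 for _, t1, t2 in pairs)
                 / sum(t1 for _, t1, _ in pairs))

# Conditionally on having the fair coin: the tosses decouple
fair = [(t1, t2) for b, t1, t2 in pairs if not b]
p_h2_fair = sum(t2 for _, t2 in fair) / len(fair)
p_h2_given_h1_fair = (sum(t1 and t2 for t1, t2 in fair)
                      / sum(t1 for t1, _ in fair))

print(p_h2, p_h2_given_h1)            # ≈ 0.70 vs ≈ 0.76: dependent
print(p_h2_fair, p_h2_given_h1_fair)  # both ≈ 0.50: conditionally independent
```

The exact unconditional numbers are $P(H_2)=0.7$ and $P(H_2\mid H_1)=0.53/0.7\approx 0.757$, which the simulation recovers up to Monte Carlo error.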

More generally, suppose a coin gives you $100\cdot R\%$ heads, where $R$ is uniformly distributed between $0$ and $1$. The tosses are conditionally independent given $R$, but given that in the first trillion tosses you get $43\%$ heads, the probability that you get heads on the next toss is close to $0.43$, so the tosses are not independent. Nor even close to independent, as you see if you consider the conditional probability that the first outcome is heads, given that the second outcome is heads (which is $2/3$) versus the marginal probability that the first outcome is heads (which is $1/2$).
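The $2/3$ versus $1/2$ comparison follows from $P(H_1\mid H_2)=E\left[R^2\right]/E\left[R\right]=(1/3)/(1/2)=2/3$, and can likewise be checked by simulation (again just an illustrative sketch):

```python
import random

random.seed(1)
N = 500_000
h1 = h2 = h1h2 = 0
for _ in range(N):
    r = random.random()            # R ~ Uniform(0, 1): the coin's heads bias
    t1 = random.random() < r       # first toss
    t2 = random.random() < r       # second toss, conditionally independent given R
    h1 += t1
    h2 += t2
    h1h2 += t1 and t2

print(h1 / N)      # ≈ 1/2: marginal P(first toss heads) = E[R]
print(h1h2 / h2)   # ≈ 2/3: P(first heads | second heads) = E[R^2] / E[R]
```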