Joint probability when variables are partially independent


There are three variables $C$, $I$, and $\theta$. $C$ and $I$ are independent. Is $$P(C, I, \theta) = P(C|\theta)P(I|\theta)P(\theta)$$ correct? I derived this from the chain rule:

$$P(C, I, \theta) = P(C|I,\theta)P(I|\theta)P(\theta).$$ I wonder if $P(C|I,\theta) = P(C|\theta)$ holds when $C$ and $I$ are independent.


A counter-example

If I understand the question correctly, we have three events $C$, $I$, and $\theta$. Let the universal set be $[0,1]\times[0,1]$ and let $C,I,\theta$ be defined by the following figure:

[figure: the events $C$, $I$, $\theta$ drawn as regions of the unit square; the areas used below are read off this figure]

Here $C,I,\theta$ are events, but one can consider their characteristic functions to be random variables.

Assume that the probability measure equals the area measure over the unit square.

First, $I\cap C\cap \theta$ is the lower left grey triangle, so $$P(I\cap C\cap \theta)=\frac18.$$

Second, one can check from the figure that $C$ and $I$ are independent.

Next, compute $$P(C\mid \theta)=\frac{P(C\cap \theta)}{P(\theta)}=\frac{\frac18}{\frac12}=\frac14.$$ Then,

$$P(I\mid \theta)=\frac{P(I\cap \theta)}{P(\theta)}=\frac{\frac18+\frac14}{\frac12}=\frac34.$$

Now

$$\frac18=P(I\cap C\cap \theta)\not=P(C\mid\theta)P(I\mid\theta)P(\theta)=\frac14\cdot\frac34\cdot\frac12=\frac{3}{32}.$$
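The mismatch can be confirmed with exact rational arithmetic, using the areas read off the figure (a minimal sketch; the variable names are my own, and the four input probabilities are the ones stated in this answer):

```python
from fractions import Fraction

# Probabilities read off the figure (as stated in the answer above).
P_theta = Fraction(1, 2)
P_C_and_theta = Fraction(1, 8)
P_I_and_theta = Fraction(1, 8) + Fraction(1, 4)    # = 3/8
P_C_I_theta = Fraction(1, 8)                       # lower left grey triangle

P_C_given_theta = P_C_and_theta / P_theta          # = 1/4
P_I_given_theta = P_I_and_theta / P_theta          # = 3/4

rhs = P_C_given_theta * P_I_given_theta * P_theta  # = 3/32
print(P_C_I_theta, rhs, P_C_I_theta == rhs)        # 1/8 3/32 False
```

Exact fractions avoid any floating-point doubt about whether the two sides merely differ by rounding.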


No.

The classic counter-example is to take the following four combinations as equally likely, each with probability $\frac14$:

C   I   theta
0   0     0
0   1     1
1   0     1
1   1     0

Then $C$ and $I$ are independent of each other (and indeed each is pairwise independent of $\theta$), but the three are not jointly independent.

So $P(C=1\mid \theta=1)=\frac12$ but $P(C=1\mid I=1, \theta=1)=0$,

and $P(C=1, I=1, \theta=1)=0 \not= \frac18=P(C=1\mid \theta=1)\,P(I=1\mid \theta=1)\,P(\theta=1).$
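This counter-example can be verified by brute-force enumeration of the four rows in the table above (a sketch; the `prob` helper and the tuple encoding are my own):

```python
from fractions import Fraction

# The four equally likely outcomes (C, I, theta) from the table above.
outcomes = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def prob(pred):
    """Probability of the event described by pred; each outcome weighs 1/4."""
    return sum(Fraction(1, 4) for o in outcomes if pred(o))

C, I, T = 0, 1, 2  # column indices into each outcome tuple

# C and I are independent of each other:
assert prob(lambda o: o[C] == 1 and o[I] == 1) \
    == prob(lambda o: o[C] == 1) * prob(lambda o: o[I] == 1)

# ...but the proposed factorisation fails:
lhs = prob(lambda o: o == (1, 1, 1))              # P(C=1, I=1, theta=1) = 0
rhs = (prob(lambda o: o[C] == 1 and o[T] == 1) / prob(lambda o: o[T] == 1)
       * prob(lambda o: o[I] == 1 and o[T] == 1) / prob(lambda o: o[T] == 1)
       * prob(lambda o: o[T] == 1))               # (1/2)(1/2)(1/2) = 1/8
print(lhs, rhs)  # 0 1/8
```

Enumerating all outcomes makes both claims mechanical to check: marginal independence of $C$ and $I$ holds, yet the three-way factorisation does not.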