How to solve probability with two conditions (with explanation)?


This is an extension of this question: Inter-causal reasoning: How to solve probability with two conditions?

I'm a beginner in probability and am trying to understand deeply what is happening underneath.

To sum up what the question is about:

We've got a graph of (binary) events:

$$ A \rightarrow C \leftarrow B $$

We're given the probabilities: $$ P(A), P(\bar A), P(B), P(\bar B) \\ P(C|A, B)\\ P(C|A, \bar B)\\ P(C|\bar A, B)\\ P(C|\bar A, \bar B) $$ where $P(A)$ is the probability of event $A$ occurring and $P(\bar A)$ is the probability of event $A$ not occurring.


We have to find the probabilities $P(B|C)$ and $P(B|C,A)$.

Before going further, I'd like to say that I want to find out a bit more than just the answer and, of course, be aware of the theorems used.

I'll begin with really simple questions (numbered to make answering them easier):

  1. Does $P(B|C,A)$ mean: the probability that event $B$ will occur, given that events $A$ and $C$ occurred?

  2. Are the events $A$ and $B$ independent? I see a V-structure here, so there's no active trail between them, right? So they're independent.

  3. We can write (Bayes theorem) that $P(B|C) = \frac{P(C|B)P(B)}{P(C)}$.

    3.1. To get $P(C|B)$ can we do conditioning and reduction on $B$ ?

    3.1.1. If yes, then which of these does it equal (and why; what is the rule? my intuition says Option B): $$ Option~A\\ P(C|B) = P(C|B,A)+P(C|B,\bar A)\\ \\ Option~B\\ P(C|B) = P(C|B,A)P(A)+P(C|B,\bar A)P(\bar A) $$

    3.1.2. Does the (in)dependence of $A$ and $B$ somehow affect the way we can compute $P(C|B)$?

    3.2. How can we compute $P(C)$? Is it: $$ P(C) = \left( P(C|A,B)+ P(C|A, \bar B) \right) \cdot \left( P(C| \bar A,B) + P(C|\bar A, \bar B) \right) $$

  4. Computing $P(B|C,A)$:

    4.1. I have computed it before, but I cannot recall how, and I don't have my notes here. That only means I didn't understand it, since I cannot do it again ;) I thought I could use Bayes's theorem here, but would that turn out to be $\frac{P(C,A|B)P(C,A)}{P(B)}$? It doesn't look right... And can I somehow use the (in)dependence of $A$ and $B$ here?

    4.2. I know there's an answer in the linked question, but this isn't about getting the answer; I want to understand how to arrive at it.

On BEST ANSWER
  1. Yes.
  2. Yes, due to the collider at $C$. It seems like you're already familiar with $d$-separation.
  3. Your statement of Bayes's Theorem is correct. I do not understand what you mean in 3.1 by "conditioning and reduction." For 3.1.1: Option A is incorrect, and Option B is correct. Option B comes from $$P(C \mid B) = P(C \cap A \mid B) + P(C \cap \overline{A} \mid B),$$ which is due to marginalization. As for 3.1.2, I don't think you can exploit the independence of $A$ and $B$ any further when computing $P(C\mid B)$, but it does show up when calculating $P(C)$ (which again is by marginalization): \begin{align*}P(C)&=P(C \cap A \cap B) + P(C \cap \overline{A} \cap B) + P(C \cap A \cap \overline{B}) + P(C \cap \overline{A} \cap \overline{B})\\ &=P(C\mid A,B)P(A)P(B) + P(C\mid \overline{A},B)P(\overline{A})P(B) + P(C\mid A,\overline{B})P(A)P(\overline{B}) + P(C\mid \overline{A},\overline{B})P(\overline{A})P(\overline{B})\end{align*} The independence shows up in an intermediate step that I omitted: $P(A \mid B)P(B) = P(A \cap B) = P(A)P(B)$.
  4. $\quad$ \begin{align*} P(B \mid A,C) &= \frac{P(C \mid A,B)\cdot P(B \mid A)}{P(C \mid A)} & \text{Bayes's Theorem}\\ &= \frac{P(C \mid A,B)P(B)}{P(C \mid A)} & \text{independence of $A$ and $B$} \end{align*} You can compute $P(C \mid A)$ in the same way as in #3 above.
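As a sanity check, you can verify all of these identities numerically by tabulating the full joint distribution. A minimal sketch; the probability values below are made-up example numbers (assumptions), not values from the question:

```python
# Numerical sanity check of the identities above for the network A -> C <- B.
# All probability values here are made-up example numbers (assumptions).
from itertools import product

P_A = 0.3
P_B = 0.6
# P(C = 1 | A = a, B = b) for each (a, b) pair
P_C_given = {(1, 1): 0.9, (1, 0): 0.5, (0, 1): 0.7, (0, 0): 0.1}

# Build the full joint P(A, B, C); A and B are independent (collider at C).
joint = {}
for a, b, c in product([0, 1], repeat=3):
    pa = P_A if a else 1 - P_A
    pb = P_B if b else 1 - P_B
    pc = P_C_given[(a, b)] if c else 1 - P_C_given[(a, b)]
    joint[(a, b, c)] = pa * pb * pc

def marg(**fixed):
    """Sum the joint over all entries matching the fixed variable values."""
    return sum(p for (a, b, c), p in joint.items()
               if all({'a': a, 'b': b, 'c': c}[k] == v for k, v in fixed.items()))

# 3.1.1 (Option B): P(C|B) = P(C|B,A)P(A) + P(C|B,Abar)P(Abar)
lhs = marg(b=1, c=1) / marg(b=1)
rhs = P_C_given[(1, 1)] * P_A + P_C_given[(0, 1)] * (1 - P_A)
assert abs(lhs - rhs) < 1e-12

# 3.2: P(C) by marginalizing over both A and B
P_C = (P_C_given[(1, 1)] * P_A * P_B + P_C_given[(0, 1)] * (1 - P_A) * P_B
       + P_C_given[(1, 0)] * P_A * (1 - P_B)
       + P_C_given[(0, 0)] * (1 - P_A) * (1 - P_B))
assert abs(P_C - marg(c=1)) < 1e-12

# 4: P(B|A,C) = P(C|A,B) P(B) / P(C|A), using independence of A and B
P_C_given_A = P_C_given[(1, 1)] * P_B + P_C_given[(1, 0)] * (1 - P_B)
P_B_given_AC = P_C_given[(1, 1)] * P_B / P_C_given_A
assert abs(P_B_given_AC - marg(a=1, b=1, c=1) / marg(a=1, c=1)) < 1e-12

print(round(marg(c=1), 4), round(lhs, 4), round(P_B_given_AC, 4))
```

With these example numbers the script prints `0.544 0.76 0.7297`; the point is that the closed-form expressions from #3 and #4 agree exactly with brute-force sums over the joint table.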