Crash course in conditional probability?


I've taken some probability classes before, but it's been a while. I'm looking for (1) some resource (website, paper, software, book, etc.) on conditional probabilities, something accessible to a junior/senior-level undergrad. Specifically, I'm interested in the relationships described by the following example:

The nodes $A$, $B$ are independent, and node $C$ depends on both (I understand this may be an example of what's called a directed acyclic graph, if that is important). Let $P(A) = 0.3$, let $P(B) = 0.01$, and let the probability of $C$ given that $A$ and $B$ are true be $P(C \mid A,B) = 0.5$. So find $P(C)$. I believe (please correct me) that

$$P(C) = P(C|A,B)P(A)P(B) = 0.0015$$

or something like that.

That may be wrong, but that's what I'm interested in, along with how to generalize these dependencies to a larger graph. And (2) -- this is something that might be too complicated for a crash course -- I'm also interested in what happens if, say, $A$ and $B$ in the example above were both dependent on each other, breaking the "directed acyclicness" of the graph.
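For concreteness, here is a sketch of the marginalization I think is involved, $P(C) = \sum_{a,b} P(C \mid a, b)\,P(a)\,P(b)$, using the numbers above. The question only specifies $P(C \mid A, B)$ when both parents are true, so the other three conditional probabilities below are hypothetical placeholders:

```python
# Marginalizing out A and B to get P(C) in a network A -> C <- B.
p_a = 0.3
p_b = 0.01

# P(C = True | A = a, B = b), keyed by the truth values (a, b).
# Only the (True, True) entry comes from the question; the rest are
# made-up placeholder values.
p_c_given = {
    (True, True): 0.5,
    (True, False): 0.2,   # hypothetical
    (False, True): 0.2,   # hypothetical
    (False, False): 0.1,  # hypothetical
}

def prob(event, p_true):
    """P(X = event) for a Boolean variable X with P(X = True) = p_true."""
    return p_true if event else 1.0 - p_true

# Law of total probability: P(C) = sum over (a, b) of P(C|a,b) P(a) P(b).
# The product P(a) P(b) is valid because A and B are independent.
p_c = sum(
    p_c_given[(a, b)] * prob(a, p_a) * prob(b, p_b)
    for a in (True, False)
    for b in (True, False)
)
print(p_c)
```

Note that the single product $P(C \mid A,B)\,P(A)\,P(B) = 0.0015$ is only the $(A=\text{True}, B=\text{True})$ term of this sum, i.e. the joint probability $P(C, A, B)$.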

Also, any comments on notation are welcome. Thank you!


Edit: I was way off in how I understood what I was looking for. The dependencies associated with the nodes I described above are specified by tables over the variables' truth values. For example, $A$'s and $B$'s probability distributions:

$$ D(A) = \begin{array}{c|c} \text{True} & \text{False} \\ \hline 0.3 & 0.7 \end{array}, \quad D(B) = \begin{array}{c|c} \text{True} & \text{False} \\ \hline 0.01 & 0.99 \end{array} $$

And $C$'s (in percentages)

[Image of $C$'s conditional probability table, giving $P(C \mid A, B)$ in percent for each truth assignment of $A$ and $B$, not available]

Then, without any observed evidence, the probability that $C$ is true is $\approx 0.9688$. How does one arrive at that value?
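The computation should again be the total-probability sum over the four rows of $C$'s table. Since the image of the table is missing here, the entries below are hypothetical percentages just to exercise the helper; they are not the values that produce $0.9688$:

```python
# Generic helper: given P(A), P(B), and a table of P(C = True | A, B)
# expressed in percent, return the marginal P(C = True).
def marginal_c(p_a, p_b, cpt_percent):
    """cpt_percent[(a, b)] = P(C = True | A = a, B = b), in percent."""
    def prob(event, p_true):
        return p_true if event else 1.0 - p_true
    return sum(
        (cpt_percent[(a, b)] / 100.0) * prob(a, p_a) * prob(b, p_b)
        for a in (True, False)
        for b in (True, False)
    )

# Hypothetical table entries (the real ones are in the missing image):
cpt = {(True, True): 50.0, (True, False): 99.0,
       (False, True): 99.0, (False, False): 97.0}
print(marginal_c(0.3, 0.01, cpt))
```

Plugging in the actual percentages from the table should reproduce the quoted $\approx 0.9688$.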