Conditional Probability Semantics


I am studying conditional probability, and have a question on the semantics of a problem.

I have the following belief network:

P(B) = 0.01
P(E) = 0.02
P(A|B ∧ E) = 0.95
P(A|B ∧ ¬E) = 0.94
P(A|¬B ∧ E) = 0.29
P(A|¬B ∧ ¬E) = 0.001
P(J|A) = 0.9
P(J|¬A) = 0.05
P(M|A) = 0.7
P(M|¬A) = 0.01

For:

B = Burglary
E = Earthquake
A = Alarm Rings
J = Jack Calls
M = Marie Calls

I am shown that, to calculate the probability that both Marie and Jack call when the alarm is ringing but neither a burglary nor an earthquake has occurred, I need to compute:

P(J|A)P(M|A)P(A|¬B ∧ ¬E)P(¬B)P(¬E)
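As a numerical sanity check, the product can be evaluated directly in Python (a minimal sketch; the values are the CPT entries listed above):

```python
# P(J|A) P(M|A) P(A|¬B,¬E) P(¬B) P(¬E), with values from the network above
p = 0.9 * 0.7 * 0.001 * (1 - 0.01) * (1 - 0.02)
print(p)  # ≈ 0.000611, i.e. about 6.1e-4
```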

However, I'm unsure how this translates. I am taking the probability of both Jack and Marie calling if the alarm rings, and the probability that the alarm rings given that there is neither an earthquake nor a burglary. However, I don't get why I then need to account for the probabilities of there not being a burglary and there not being an earthquake.

Doesn't my already including the term P(A|¬B ∧ ¬E) imply ¬B and ¬E? If it didn't, wouldn't that mean P((B ∨ E)|(¬B ∧ ¬E)) > 0? In other words, doesn't explicitly including the probabilities of there being no burglary and no earthquake suggest that the event "alarm rings without burglary or earthquake" could occur while "burglary" or "earthquake" is true?

Thanks! :)


There is 1 answer below.


calculate the probability of both Marie and Jack calling when the alarm is ringing but neither a burglary nor an earthquake.

This means the probability sought is $P(J,M,A,\neg B,\neg E)$, with comma denoting "and" or "intersection". By the Chain Rule:

\begin{eqnarray*} P(J,M,A,\neg B,\neg E) &=& P(J \vert M,A,\neg B,\neg E) P(M \vert A,\neg B, \neg E) P(A \vert \neg B,\neg E) P(\neg B \vert \neg E) P(\neg E) \\ && \\ &=& P(J \mid A) P(M \mid A) P(A \mid \neg B,\neg E) P(\neg B) P(\neg E). \end{eqnarray*}

That last step follows from the independence and conditional independence assumptions built into the network. E.g. given $A$, the events $J$ and $M$ are independent, and the events $B$ and $E$ are (unconditionally) independent. These assumptions are made when constructing the network, presumably to match reality.
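These independence assumptions can be verified by brute force: build the full joint from the factorization and marginalize. A sketch (helper names are my own; the CPT values are those given in the question):

```python
from itertools import product

# CPT values copied from the network in the question
P_B, P_E = 0.01, 0.02
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}
P_J = {True: 0.9, False: 0.05}   # P(J | A), P(J | ¬A)
P_M = {True: 0.7, False: 0.01}   # P(M | A), P(M | ¬A)

def bern(p, val):
    """Probability that an event with probability p takes truth value val."""
    return p if val else 1 - p

def joint(b, e, a, j, m):
    """Full joint via P(B) P(E) P(A|B,E) P(J|A) P(M|A)."""
    return (bern(P_B, b) * bern(P_E, e) * bern(P_A[(b, e)], a)
            * bern(P_J[a], j) * bern(P_M[a], m))

# Check that J and M are conditionally independent given A:
p_a = sum(joint(b, e, True, j, m)
          for b, e, j, m in product([True, False], repeat=4))
p_jm_a = sum(joint(b, e, True, True, True)
             for b, e in product([True, False], repeat=2))
print(p_jm_a / p_a)           # P(J, M | A)
print(P_J[True] * P_M[True])  # P(J | A) P(M | A) = 0.63 — they agree
```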

The Chain Rule is actually a series of steps combined into one. It might help to break it down:

\begin{eqnarray*} P(J,M,A,\neg B,\neg E) &=& P(J \mid M,A,\neg B,\neg E) P(M,A,\neg B, \neg E) \\ && \\ &=& P(J \mid M,A,\neg B,\neg E) P(M \mid A,\neg B, \neg E) P(A,\neg B,\neg E). \end{eqnarray*}

So one of the terms we need to evaluate is $P(A,\neg B,\neg E)$. Continuing with the Chain Rule:

\begin{eqnarray*} P(A,\neg B,\neg E) &=& P(A \mid \neg B,\neg E) P(\neg B,\neg E) \\ && \\ &=& P(A \mid \neg B,\neg E) P(\neg B \mid \neg E) P(\neg E). \\ && \\ &=& P(A \mid \neg B,\neg E) P(\neg B) P(\neg E) \qquad\text{(by independence)}. \\ \end{eqnarray*}
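Numerically, this step works out as follows (a minimal sketch with the values from the network):

```python
# P(A, ¬B, ¬E) = P(A | ¬B, ¬E) P(¬B) P(¬E), using independence of B and E
p_a_given_nb_ne = 0.001            # P(A | ¬B, ¬E)
p_nb, p_ne = 1 - 0.01, 1 - 0.02    # P(¬B) and P(¬E)
p_a_nb_ne = p_a_given_nb_ne * p_nb * p_ne
print(p_a_nb_ne)  # P(A, ¬B, ¬E) = 0.0009702
```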

Doesn't my already including the term P(A|¬B ∧ ¬E) imply ¬B and ¬E?

No, it says what would happen *if* they occurred. The "they did occur" part is supplied by multiplying by $P(\neg B,\neg E)$. Otherwise, you would really be saying that $P(A,\neg B,\neg E)$ and $P(A \mid \neg B,\neg E)$ are the same thing.

Maybe one final way to illustrate the difference: the event $A\cap B \cap E$ is extremely unlikely, with probability $0.95 \times 0.01 \times 0.02 = 0.00019$. However, $P(A\mid B \cap E)$ is very likely, at $0.95$.
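The same contrast in code (values from the network above):

```python
# Joint vs. conditional: the same conditional probability, two very different numbers
p_a_given_be = 0.95                    # P(A | B, E): alarm very likely if both occur
p_joint = p_a_given_be * 0.01 * 0.02   # P(A, B, E) = P(A|B,E) P(B) P(E)
print(p_joint)       # 0.00019 — tiny, because B and E are each rare
print(p_a_given_be)  # 0.95 — large, because it assumes B and E already happened
```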

I hope that helps.