Understanding how to read Bayesian networks


Below is the example I want to talk about: [network figure omitted: five binary nodes $B, E, A, J, M$ with arcs $B\to A$, $E\to A$, $A\to J$, $A\to M$]

I'll refer to each variable by the first letter of its name in the bubbles. One question I have is: how would I calculate $P(M\mid B)$? This is what I have so far: $$P(M\mid B) = \dfrac{P(M,B)}{P(B)}$$ Since $P(B)$ is known, I move on to the numerator, summing over all $2^3$ assignments of the remaining variables: $$\begin{align}P(M,B) ={}& P(M,B,J,E,A)+P(M,B,J,E,\neg A)+P(M,B,J,\neg E,A)+P(M,B,J,\neg E,\neg A)\\ &+P(M,B,\neg J,E,A)+P(M,B,\neg J,E,\neg A)+P(M,B,\neg J,\neg E,A)+P(M,B,\neg J,\neg E,\neg A)\end{align}$$ Is this really the way? I'm surprised by how much computation is needed for such a small problem. Am I doing this correctly?

There are 2 answers below.


Well, the joint probability distribution is $$p(B,E,A,J,M) = p(B)\, p(E)\, p(A\mid B,E)\, p(J\mid A)\, p(M\mid A).$$ What you are looking for is the marginal distribution $$p(M,B) = \sum_{E,A,J} p(B,E,A,J,M),$$ where the sum runs over all values of $E$, $A$ and $J$. That is exactly what you did, so yes, you have it right!
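To make this concrete, here is a minimal brute-force sketch in Python. The CPT values below are the standard textbook numbers for the burglary-alarm network; they are an assumption, since the original figure is not reproduced here:

```python
from itertools import product

# CPTs for the classic burglary-alarm network.
# NOTE: these numbers are an assumption (the usual textbook values),
# since the original figure is not shown in the question.
P_B = 0.001
P_E = 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(A=true | B, E)
P_J = {True: 0.90, False: 0.05}   # P(J=true | A)
P_M = {True: 0.70, False: 0.01}   # P(M=true | A)

def joint(b, e, a, j, m):
    """p(B,E,A,J,M) = p(B) p(E) p(A|B,E) p(J|A) p(M|A)."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return pb * pe * pa * pj * pm

# p(M=true, B=true): sum the joint over all values of E, A, J.
p_MB = sum(joint(True, e, a, j, True)
           for e, a, j in product([True, False], repeat=3))

p_M_given_B = p_MB / P_B
print(round(p_M_given_B, 4))
```

This is exactly the eight-term sum from the question, just generated programmatically rather than written out by hand.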


You are correct: exact inference in Bayesian networks does carry a computational load. Still, you were on the right path.

However, you could have simplified it slightly by noting that the child node "JohnCalls" is not needed in the factorisation of $p(M,B)$: it gets "summed out", since $\sum_J p(J\mid A)=1$ by the law of total probability.

$$\begin{align}p(\mathrm B,\mathrm M)&=\sum_{A,E,J} p(\mathrm B,E,A,J,\mathrm M)\\[1ex]&=\sum_{A,E,J}p(\mathrm B)p(E)p(A\mid\mathrm B,E)p(J\mid A)p(\mathrm M\mid A)\\[1ex]&=p(\mathrm B)\sum_A p(\mathrm M\mid A)\sum_E p(A\mid\mathrm B,E)p(E)\sum_J p(J\mid A)\\[1ex]&=p(\mathrm B)\sum_A p(\mathrm M\mid A)\sum_E p(A\mid\mathrm B,E)p(E)\\[4ex]p(\mathrm M\mid \mathrm B)&=\sum_A p(\mathrm M\mid A)\sum_E p(A\mid\mathrm B,E)p(E) \\[1ex]&=p(\mathrm M\mid A)\big(p(A\mid\mathrm B,E) p(E)+p(A\mid\mathrm B,\neg E)p(\neg E)\big)+p(\mathrm M\mid\neg A)\big(p(\neg A\mid\mathrm B,E)p(E)+p(\neg A\mid\mathrm B,\neg E)p(\neg E)\big)\end{align}$$
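Evaluated numerically, the simplified expression needs only the $A$ and $E$ factors. The CPT values used in this sketch are the usual textbook numbers for this network, assumed here since the figure is not shown:

```python
# Direct evaluation of the simplified expression
#   p(M | B) = sum_A p(M|A) * sum_E p(A|B,E) p(E).
# CPT numbers are an assumption (standard textbook values),
# since the original figure is not reproduced in the question.
P_E = 0.002
P_A_given_BE = {True: 0.95, False: 0.94}   # P(A=true | B=true, E)
P_M_given_A = {True: 0.70, False: 0.01}    # P(M=true | A)

# Inner sum: P(A=true | B=true), marginalising over E.
p_A = sum(P_A_given_BE[e] * (P_E if e else 1 - P_E)
          for e in (True, False))

# Outer sum over A. JohnCalls never appears: it summed out to 1.
p_M_given_B = (P_M_given_A[True] * p_A
               + P_M_given_A[False] * (1 - p_A))
print(round(p_M_given_B, 4))
```

Note how this touches only 4 CPT entries, versus the 8 full-joint terms in the brute-force sum, which is the whole point of pushing the sums inside the factorisation.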