Bayesian Network and Conditional Independence


I have the following Bayesian Network

[Figure: a Bayesian network over nodes $Q, S, H, G, E$ with edges $Q \to H$, $S \to H$, $Q \to G$, $H \to G$, $S \to E$, as implied by the conditional probability tables below.]

I'm given that

$P(q) = 0.8$
$P(s) = 0.7$

$P(h|q,s)=0.75$
$P(h|q,\bar s)=0.85$
$P(h|\bar q,s)=0.15$
$P(h|\bar q,\bar s)=0.3$

$P(g|q, h) = 0.75$
$P(g|q, \bar h) = 0.4$
$P(g|\bar q, h) = 0.6$
$P(g|\bar q, \bar h) = 0.3$

$P(e|s) = 0.7$
$P(e|\bar s) = 0.5$

I'm trying to find $P(g|e)$ but I'm getting stuck applying Bayes' theorem.

Using Bayes' theorem I get

$P(g|e) = \frac{P(e|g)P(g)}{P(e)}$, but I get caught in a loop because I'm not sure how to compute $P(e|g)$.

Can I say that, because $g$ and $e$ are conditionally independent, $P(g|e) = P(g)$?

I have found $P(g) = 0.61$ and $P(e) = 0.64$.
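Both of those marginals can be checked by enumerating the full joint distribution. A minimal sketch in Python (variable names are my own; the probability tables are the ones given above):

```python
from itertools import product

# CPTs from the question; each dict maps parent values -> P(child = True | parents).
P_q, P_s = 0.8, 0.7
P_h = {(True, True): 0.75, (True, False): 0.85,
       (False, True): 0.15, (False, False): 0.30}   # P(h | q, s)
P_g = {(True, True): 0.75, (True, False): 0.40,
       (False, True): 0.60, (False, False): 0.30}   # P(g | q, h)
P_e = {True: 0.7, False: 0.5}                        # P(e | s)

def pr(p, x):
    """P(X = x) for a binary X with P(X = True) = p."""
    return p if x else 1 - p

p_g = p_e = 0.0
for q, s, h in product((True, False), repeat=3):
    w = pr(P_q, q) * pr(P_s, s) * pr(P_h[(q, s)], h)  # P(q, s, h)
    p_g += w * P_g[(q, h)]   # marginalize out q, s, h for P(g)
    p_e += w * P_e[s]        # marginalize out s for P(e)

print(round(p_g, 4), round(p_e, 2))  # 0.6101 0.64
```

So $P(g) \approx 0.61$ and $P(e) = 0.64$ both check out.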

Accepted answer

Can I say that, because $g$ and $e$ are conditionally independent, $P(g|e)=P(g)$?

No. $P(g\mid e)=P(g)$ would assert that $g$ and $e$ are (unconditionally) independent. Rather, they are conditionally independent given $s$.


Using the notation $s^1 := s$ and $s^{-1} := \neg s$, the Law of Total Probability can be written as: $$\begin{align}p(e) &= p(e\mid s^1)p(s^1)+p(e\mid s^{-1})p(s^{-1})\\[1ex] & =\sum_\delta p(e\mid s^\delta)p(s^\delta)&&\text{implicitly: }\delta\in\{-1,1\}\end{align}$$
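Plugging in the given values $P(e\mid s)=0.7$, $P(e\mid \bar s)=0.5$, $P(s)=0.7$: $$p(e) = 0.7\cdot 0.7 + 0.5\cdot 0.3 = 0.64,$$ which matches the value found in the question.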

Then you can say that:

$$\begin{align}p(e,g) &= \sum_\alpha p(e,g\mid s^\alpha)p(s^\alpha) &&\text{LoTP} \\[1ex] &= \sum_\alpha p(e\mid s^\alpha)p(g\mid s^\alpha)p(s^\alpha) && \text{conditional independence}\end{align}$$
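The only factor above that is not given directly is $p(g\mid s^\alpha)$. It can itself be expanded over $q$ and $h$, using the facts that $q$ is independent of $s$ and that $g$ depends only on its parents $q$ and $h$: $$p(g\mid s^\alpha) = \sum_{\beta,\gamma} p(q^\beta)\,p(h^\gamma\mid q^\beta, s^\alpha)\,p(g\mid q^\beta, h^\gamma)$$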

Now, use the LoTP to expand this expression into known probabilities, and thus find: $$p(g\mid e) = \dfrac{p(e,g)}{p(e)}$$
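The whole computation can be sketched in Python (a numerical check, with variable names of my own; the probability tables are the ones given in the question):

```python
# CPTs from the question; each dict maps parent values -> P(child = True | parents).
P_q, P_s = 0.8, 0.7
P_h = {(True, True): 0.75, (True, False): 0.85,
       (False, True): 0.15, (False, False): 0.30}   # P(h | q, s)
P_g = {(True, True): 0.75, (True, False): 0.40,
       (False, True): 0.60, (False, False): 0.30}   # P(g | q, h)
P_e = {True: 0.7, False: 0.5}                        # P(e | s)

def pr(p, x):
    """P(X = x) for a binary X with P(X = True) = p."""
    return p if x else 1 - p

def p_g_given_s(s):
    # p(g | s) = sum_{q,h} p(q) p(h | q, s) p(g | q, h)
    return sum(pr(P_q, q) * pr(P_h[(q, s)], h) * P_g[(q, h)]
               for q in (True, False) for h in (True, False))

# p(e) and p(e, g) by the Law of Total Probability over s;
# the joint term uses the conditional independence e ⊥ g | s.
p_e = sum(P_e[s] * pr(P_s, s) for s in (True, False))
p_eg = sum(P_e[s] * p_g_given_s(s) * pr(P_s, s) for s in (True, False))

print(round(p_e, 2), round(p_eg / p_e, 4))  # 0.64 0.6077
```

Note that $p(g\mid e) \approx 0.6077$ is close to, but not equal to, $p(g) \approx 0.6101$: observing $e$ carries a little information about $g$ through $s$, which is exactly why the two variables are dependent unconditionally but independent given $s$.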