Finding Conditional Probabilities and Independence from a Joint Probability Distribution Table


I have the following joint probability distribution table,

          E     E    ~E    ~E
          G    ~G     G    ~G
     F    a     b     c     d
    ~F    w     x     y     z

a+b+c+d+w+x+y+z = 1

Given that F is True, I need to find,

  1. P(E|F,G)
  2. P(E|F,~G)
  3. P(G|F,E)
  4. P(G|F,~E)

To find these, is it as simple as looking up the table entry where E, F, and G are all True (for the first one, that would be "a", and so on), or is a more complex calculation needed?

Beyond this, I also need to determine whether E and G are conditionally independent given F. How do I do this?

1 Answer


To find these, is it as simple as looking up the table entry where E, F, and G are all True (for the first one, that would be "a", and so on), or is a more complex calculation needed?

It is just a bit more complex. That is the method to find the joint probability, $\mathsf P(E,F,G)=a$, but that is not entirely what you want.

You'll need to use the definition of conditional probability, and the Law of Total Probability:

$\qquad\mathsf P(F\mid E, G) = \dfrac{\mathsf P(E,F,G)}{\mathsf P(E,F,G)+\mathsf P(E,{\neg}F,G)}=\dfrac{a}{a+w}$

[Note: this is not one of yours, but they are solved similarly.]
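The lookup-plus-normalization recipe can be sketched in Python. The numeric values below are hypothetical, chosen only so the eight entries sum to 1; `p` and `cond` are illustrative helper names, not part of the question.

```python
# Hypothetical joint table, keyed by truth values of (F, E, G).
joint = {
    (True,  True,  True):  0.10,  # a
    (True,  True,  False): 0.20,  # b
    (True,  False, True):  0.05,  # c
    (True,  False, False): 0.15,  # d
    (False, True,  True):  0.10,  # w
    (False, True,  False): 0.10,  # x
    (False, False, True):  0.20,  # y
    (False, False, False): 0.10,  # z
}

def p(**fixed):
    """Marginal probability of the given variable assignments (F, E, G)."""
    names = ("F", "E", "G")
    return sum(prob for key, prob in joint.items()
               if all(key[names.index(n)] == v for n, v in fixed.items()))

def cond(target, value, **given):
    """P(target=value | given) = P(target=value, given) / P(given)."""
    return p(**{target: value, **given}) / p(**given)

# 1. P(E | F, G)   = a / (a + c)
print(cond("E", True, F=True, G=True))
# 2. P(E | F, ~G)  = b / (b + d)
print(cond("E", True, F=True, G=False))
# 3. P(G | F, E)   = a / (a + b)
print(cond("G", True, F=True, E=True))
# 4. P(G | F, ~E)  = c / (c + d)
print(cond("G", True, F=True, E=False))
```

Note that each denominator sums the joint entries over the one variable being conditioned away, exactly as in the Law of Total Probability above.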

Beyond this, I also need to determine whether E and G are conditionally independent given F. How do I do this?

Find those conditional probabilities and you will have your answer. $E$ and $G$ are conditionally independent given $F$ exactly when $\mathsf P(E\mid F,G)=\mathsf P(E\mid F, \neg G)$ (both then equal $\mathsf P(E\mid F)$), and this holds precisely when $\mathsf P(G\mid E,F)=\mathsf P(G\mid \neg E, F)$.
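The independence check itself is a single comparison. In this sketch the joint entries are hypothetical and deliberately chosen to factorize within the $F$ block, so the test comes out True; with generic values it would typically be False.

```python
import math

# Hypothetical joint table, keyed by (F, E, G). Within F=True the block is
# built as P(E|F) * P(G|F) * P(F) with P(E|F)=0.5, P(G|F)=0.4, P(F)=0.5,
# so E and G are conditionally independent given F by construction.
joint = {
    (True,  True,  True):  0.10,  # a
    (True,  True,  False): 0.15,  # b
    (True,  False, True):  0.10,  # c
    (True,  False, False): 0.15,  # d
    (False, True,  True):  0.05,  # w
    (False, True,  False): 0.20,  # x
    (False, False, True):  0.15,  # y
    (False, False, False): 0.10,  # z
}

def p_E_given(F, G):
    """P(E | F, G), by normalizing over E within the fixed (F, G) column."""
    num = joint[(F, True, G)]
    den = joint[(F, True, G)] + joint[(F, False, G)]
    return num / den

# E and G are conditionally independent given F
# exactly when P(E | F, G) == P(E | F, ~G).
independent = math.isclose(p_E_given(True, True), p_E_given(True, False))
print(independent)  # prints True for these numbers
```

With the original symbolic entries, the same comparison reads $\frac{a}{a+c} = \frac{b}{b+d}$.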