Suppose we have the following scenario:
And I want to tell if $B$ and $E$ are independent.
It looks like they are independent because they don't seem to have a common parent, but I need a more exact explanation.
You might use the Law of Total Probability to verify that the definition of independence holds.
$$\begin{align}\mathsf P(E,B)&=\mathsf P(E,A,B)+\mathsf P(E,A^\complement,B)&&\text{Law of Total Probability}\\[1ex]&=\mathsf P(E)~\mathsf P(A\mid E,B)~\mathsf P(B)+\mathsf P(E)~\mathsf P(A^\complement\mid E,B)~\mathsf P(B)&&\text{Factorisation via the DAG}\\[1ex]&=\mathsf P(E)~\left(\mathsf P(A\mid E,B)+\mathsf P(A^\complement\mid E,B)\right)~\mathsf P(B)&&\text{Distribution of common factors}\\[1ex]&=\mathsf P(E)~(1)~\mathsf P(B)&&\text{Probability of Complements}\end{align}$$
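This marginalisation is easy to check numerically. A minimal sketch, assuming the usual burglary/earthquake/alarm numbers ($\mathsf P(B)=0.001$, $\mathsf P(E)=0.002$, and an illustrative CPT for $A$ — these figures are assumptions, not given in the question; only the DAG structure $B \to A \leftarrow E$ matters):

```python
# Assumed CPTs for the classic burglary/earthquake/alarm network.
P_B = 0.001   # assumed P(Burglary = 1)
P_E = 0.002   # assumed P(Earthquake = 1)
P_A_given = {  # assumed P(Alarm = 1 | B, E)
    (1, 1): 0.95,
    (1, 0): 0.94,
    (0, 1): 0.29,
    (0, 0): 0.001,
}

def joint(b, e, a):
    """P(B=b, E=e, A=a) under the DAG factorisation P(B) P(E) P(A | B, E)."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A_given[(b, e)] if a else 1 - P_A_given[(b, e)]
    return pb * pe * pa

# Marginalise out A: P(E=1, B=1) = sum over a of P(B=1, E=1, A=a)
p_eb = sum(joint(1, 1, a) for a in (0, 1))
print(p_eb, P_E * P_B)  # the two numbers agree, so B and E are independent
assert abs(p_eb - P_E * P_B) < 1e-12
```

Whatever numbers you put in the CPT of $A$, the sum over its two values collapses to $1$, exactly as in the derivation above.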
They are marginally independent: a common 'child' only induces dependence once you condition on it. To check the assumption of independence: \begin{align*} \mathbb{P}(B) &= \mathbb{P}(B|E) = \mathbb{P}(B|E,A)\mathbb{P}(A|E) + \mathbb{P}(B | E,A^c)\mathbb{P}(A^c | E). \end{align*} Now, using the factorisation given by the DAG, \begin{align*} \mathbb{P}(B|E,A) &= \frac{\mathbb{P}(B,E,A)}{\mathbb{P}(E,A)} = \frac{\mathbb{P}(A|B,E) \mathbb{P}(B)\mathbb{P}(E)}{\mathbb{P}(A|E)\mathbb{P}(E)} = \frac{\mathbb{P}(A|B,E) \mathbb{P}(B)}{\mathbb{P}(A|E)} \end{align*} In the same way we can rewrite $\mathbb{P}(B|E,A^c)$, which yields: \begin{align*} \mathbb{P}(B|E) = \mathbb{P}(A|B,E)\mathbb{P}(B) + \mathbb{P}(A^c|B,E)\mathbb{P}(B) = \left(\mathbb{P}(A|B,E) + \mathbb{P}(A^c|B,E)\right)\mathbb{P}(B) = \mathbb{P}(B), \end{align*} so there is no contradiction; with the numbers from the standard alarm example, $0.95 \cdot 0.001 + 0.05\cdot 0.001 = 0.001 = \mathbb{P}(B)$. Conditioning on $A$, however, does make $B$ and $E$ dependent ('explaining away').
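For contrast, a small sketch of the 'explaining away' effect, using the same assumed textbook CPTs (the numbers are illustrative, not from the question): once we observe the alarm $A=1$, learning that an earthquake occurred makes a burglary much less likely, so $B$ and $E$ are dependent given $A$.

```python
# Assumed CPTs for the classic burglary/earthquake/alarm network.
P_B = 0.001   # assumed P(Burglary = 1)
P_E = 0.002   # assumed P(Earthquake = 1)
P_A_given = {(1, 1): 0.95, (1, 0): 0.94, (0, 1): 0.29, (0, 0): 0.001}

def joint(b, e, a):
    """P(B=b, E=e, A=a) under the DAG factorisation P(B) P(E) P(A | B, E)."""
    pb = P_B if b else 1 - P_B
    pe = P_E if e else 1 - P_E
    pa = P_A_given[(b, e)] if a else 1 - P_A_given[(b, e)]
    return pb * pe * pa

# P(B=1 | A=1): condition only on the common child
p_b_given_a = (sum(joint(1, e, 1) for e in (0, 1))
               / sum(joint(b, e, 1) for b in (0, 1) for e in (0, 1)))

# P(B=1 | A=1, E=1): additionally condition on the earthquake
p_b_given_ae = joint(1, 1, 1) / sum(joint(b, 1, 1) for b in (0, 1))

print(p_b_given_a, p_b_given_ae)
# Learning E=1 lowers the probability of B: the earthquake "explains away"
# the alarm, so B and E are dependent once A is observed.
assert p_b_given_ae < p_b_given_a
```

This is exactly the collider behaviour: the path $B \to A \leftarrow E$ is blocked when $A$ is unobserved (marginal independence) and opened when $A$ is conditioned on.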