I have two similar questions:
1) Does there exist a distribution of three random variables such that
$I(a:b) = 0$ and $I(a:b|c) > 0$ (where $I(a:b)$ is the mutual information and $I(a:b|c)$ is the conditional mutual information)?
2) Does there exist a distribution of three random variables such that
$I(a:b) > 0$ and $I(a:b|c) = 0$?
For question (1) I have no idea; for question (2) I think it is possible only when we have a Markov chain $a - c - b$.
Am I right about question (2), and how can I find the answer to question (1)? Please help. Thank you in advance.
The classical example for the first case is $b = a + c$, where $a$ and $c$ are independent fair Bernoulli variables with $P(a=1)=P(a=0)=1/2$, and the sum is taken modulo 2 (XOR). Then $I(a;b)=0$ (knowing one of the inputs tells you nothing about the output) and $I(a;b|c)=1$ (once one input is known, knowing the other is knowing the output).
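You can verify this numerically by enumerating the joint distribution. Below is a small sketch (the function names `marginal`, `mutual_information`, and `conditional_mi` are my own, not standard library functions):

```python
from itertools import product
from math import log2

# Joint distribution of the XOR example: a, c are independent fair bits,
# b = a XOR c, so each (a, c) pair has probability 1/4.
p = {}
for a, c in product((0, 1), repeat=2):
    b = a ^ c
    p[(a, b, c)] = 0.25

def marginal(dist, idxs):
    """Marginalize a joint distribution onto the coordinates in idxs."""
    out = {}
    for xs, pr in dist.items():
        key = tuple(xs[i] for i in idxs)
        out[key] = out.get(key, 0.0) + pr
    return out

def mutual_information(dist):
    """I(a;b) = sum_{a,b} p(a,b) log2( p(a,b) / (p(a) p(b)) )."""
    pab = marginal(dist, (0, 1))
    pa = marginal(dist, (0,))
    pb = marginal(dist, (1,))
    return sum(pr * log2(pr / (pa[(a,)] * pb[(b,)]))
               for (a, b), pr in pab.items() if pr > 0)

def conditional_mi(dist):
    """I(a;b|c) = sum_c p(c) * I(a;b | C=c)."""
    pc = marginal(dist, (2,))
    total = 0.0
    for (c,), pcv in pc.items():
        # Conditional joint of (a, b) given C=c, padded with a dummy third
        # coordinate so mutual_information can be reused.
        cond = {(a, b, 0): pr / pcv
                for (a, b, cc), pr in dist.items() if cc == c}
        total += pcv * mutual_information(cond)
    return total

print(mutual_information(p))   # 0.0
print(conditional_mi(p))       # 1.0
```

Given $c$, the pair $(a,b)$ is a perfectly correlated fair bit, so each conditional term contributes one full bit.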
For the second, you are right about the Markov chain example.
In general, $I(X;Y)=0$ iff $X$ and $Y$ are independent; hence $I(a;b|c)=0$ iff $a$ and $b$ are conditionally independent given $c$. Then
$$I(a;b|c)=0 \iff p(a,b|c)=p(a|c) \, p(b|c)\\ \iff p(a,b,c) p(c)=p(a,c) p(b,c)\\ \iff \frac{p(a,b,c)}{p(b,c)}=\frac{p(a,c)}{p(c)}\\ \iff p(a|b,c)=p(a|c)$$ Hence, $b \to c \to a $ form a Markov chain.
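A concrete instance of the second case can also be checked numerically via the entropy identities $I(a;b) = H(a) + H(b) - H(a,b)$ and $I(a;b|c) = H(a,c) + H(b,c) - H(c) - H(a,b,c)$. This sketch builds a noisy-copy Markov chain $a \to c \to b$ (the flip probability `eps` is an arbitrary choice for illustration):

```python
from math import log2

# Markov chain a -> c -> b: a is a fair bit, c is a copy of a flipped with
# probability eps, and b is a copy of c flipped independently with
# probability eps.
eps = 0.25
p = {}
for a in (0, 1):
    for c in (0, 1):
        for b in (0, 1):
            pr = 0.5
            pr *= (1 - eps) if c == a else eps
            pr *= (1 - eps) if b == c else eps
            p[(a, b, c)] = pr

def H(dist):
    """Shannon entropy (in bits) of a distribution given as {outcome: prob}."""
    return -sum(pr * log2(pr) for pr in dist.values() if pr > 0)

def marg(dist, idxs):
    """Marginalize the joint onto the coordinates in idxs."""
    out = {}
    for xs, pr in dist.items():
        key = tuple(xs[i] for i in idxs)
        out[key] = out.get(key, 0.0) + pr
    return out

# I(a;b) = H(a) + H(b) - H(a,b)
I_ab = H(marg(p, (0,))) + H(marg(p, (1,))) - H(marg(p, (0, 1)))
# I(a;b|c) = H(a,c) + H(b,c) - H(c) - H(a,b,c)
I_ab_c = H(marg(p, (0, 2))) + H(marg(p, (1, 2))) - H(marg(p, (2,))) - H(p)

print(I_ab)    # strictly positive
print(I_ab_c)  # zero (up to floating-point error)
```

Here $I(a;b) > 0$ because $b$ is still correlated with $a$ through $c$, while $I(a;b|c) = 0$ because, given $c$, the two noise flips are independent.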