Proof of a fact about mutual information and entropy


I need to prove that for any joint distribution of random variables $(a, b, c)$ satisfying $I(a;b \mid c) = I(a;c \mid b) = I(b;c \mid a) = 0$, there exists a random variable $d$ such that $H(d) = I(a;b;c)$ and $a, b, c$ are conditionally independent given $d$. In other words, I believe this amounts to the possibility of "extracting" the mutual information shared by $a$, $b$, and $c$ into a single variable. I tried using the chain rule and the relations between mutual information and entropy, but without success. Any help would be appreciated.
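For reference, here is the identity I have been working from, assuming $I(a;b;c)$ denotes the triple (interaction) information with the usual inclusion–exclusion convention:

$$
I(a;b;c) = I(a;b) - I(a;b \mid c) = H(a) + H(b) + H(c) - H(a,b) - H(a,c) - H(b,c) + H(a,b,c).
$$

Note that under the hypotheses, $I(a;b;c) = I(a;b) = I(a;c) = I(b;c) \ge 0$, since each conditional term vanishes.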
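To convince myself the statement is plausible, here is a small numeric sanity check on a toy example of my own construction (not from the problem itself): let $d, x, y, z$ be independent uniform bits and set $a = (d,x)$, $b = (d,y)$, $c = (d,z)$. Then the three conditional mutual informations vanish, $I(a;b;c) = H(d) = 1$ bit, and $a, b, c$ are independent given $d$:

```python
from itertools import product
from math import log2

# Joint distribution over (a, b, c, d): 16 equally likely outcomes,
# with a = (d, x), b = (d, y), c = (d, z) for independent uniform bits.
joint = {}
for d, x, y, z in product((0, 1), repeat=4):
    joint[((d, x), (d, y), (d, z), d)] = 1 / 16

def H(indices):
    """Entropy (in bits) of the marginal on the given coordinate indices."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in indices)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

A, B, C, D = 0, 1, 2, 3

def I_cond(i, j, k):
    # I(i;j|k) = H(i,k) + H(j,k) - H(k) - H(i,j,k)
    return H([i, k]) + H([j, k]) - H([k]) - H([i, j, k])

# Triple information via inclusion-exclusion over entropies.
I_abc = (H([A]) + H([B]) + H([C])
         - H([A, B]) - H([A, C]) - H([B, C])
         + H([A, B, C]))

print(I_cond(A, B, C))  # 0.0
print(I_cond(A, C, B))  # 0.0
print(I_cond(B, C, A))  # 0.0
print(I_abc, H([D]))    # 1.0 1.0  ->  I(a;b;c) = H(d)

# Conditional independence given d: H(a,b,c|d) == H(a|d) + H(b|d) + H(c|d)
print(H([A, B, C, D]) - H([D]), 3 * (H([A, D]) - H([D])))  # 3.0 3.0
```

So in this example the common part $d$ really does carry exactly $I(a;b;c)$ bits; what I am missing is how to construct such a $d$ in general.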