Proof of inequality about mutual information


There is some joint distribution of random variables $(a, b, c, x, y)$. It is known that $I(a;b|c) = I(a;c|b) = I(b;c|a) = 0$. I need to prove the inequality
$$I(a;b) \leq I(a;b|x) + I(a;b|y) + I(x;y).$$
I have no idea how to proceed. I tried using the relationship between entropy and mutual information, as well as the chain rule, but without success. Please help. Thank you in advance.
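For intuition, here is a quick numerical sanity check (not a proof). It evaluates both sides of the inequality on one concrete distribution satisfying the hypotheses: the degenerate case $a=b=c=x=y$ with a fair bit, which I chose as an assumption for illustration. In that case $I(a;b|c)=I(a;c|b)=I(b;c|a)=0$ holds trivially, and the inequality is tight: $I(a;b) = I(x;y) = 1$ bit while $I(a;b|x) = I(a;b|y) = 0$.

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array (zeros ignored)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mi(p_uv):
    """I(U;V) from a 2-D joint distribution p(u,v)."""
    return H(p_uv.sum(axis=1)) + H(p_uv.sum(axis=0)) - H(p_uv.ravel())

def cmi(p_uvw):
    """I(U;V|W) from a 3-D joint distribution p(u,v,w): average of
    the conditional mutual informations I(U;V | W=w), weighted by p(w)."""
    pw = p_uvw.sum(axis=(0, 1))
    return sum(pw[k] * mi(p_uvw[:, :, k] / pw[k])
               for k in range(p_uvw.shape[2]) if pw[k] > 0)

# Illustrative example (an assumption, not from the question):
# a = b = c = x = y, a single fair bit copied five times.
p = np.zeros((2, 2, 2, 2, 2))
p[0, 0, 0, 0, 0] = 0.5
p[1, 1, 1, 1, 1] = 0.5

p_abc = p.sum(axis=(3, 4))           # p(a,b,c)
p_abx = p.sum(axis=(2, 4))           # p(a,b,x)
p_aby = p.sum(axis=(2, 3))           # p(a,b,y)
p_ab  = p.sum(axis=(2, 3, 4))        # p(a,b)
p_xy  = p.sum(axis=(0, 1, 2))        # p(x,y)

# Hypotheses: all three conditional mutual informations vanish.
print("I(a;b|c) =", cmi(p_abc))                        # 0.0
print("I(a;c|b) =", cmi(p_abc.transpose(0, 2, 1)))     # 0.0
print("I(b;c|a) =", cmi(p_abc.transpose(1, 2, 0)))     # 0.0

# Both sides of the inequality: equality in this degenerate case.
lhs = mi(p_ab)
rhs = cmi(p_abx) + cmi(p_aby) + mi(p_xy)
print("I(a;b) =", lhs, "<= RHS =", rhs)
```

Of course a single example proves nothing; the point is only that the bound can be tight, so any proof attempt must not discard the $I(x;y)$ term.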