Proof of an inequality with entropy and mutual information.


The entropy of a random variable $a$ is $h$: $H(a) = h$.

The mutual information of $a$ and $b$ is $3h/4$: $I(a;b) = 3h/4$.

The mutual information of $a$ and $c$ is $3h/4$: $I(a;c) = 3h/4$.

I need to prove that $I(b;c) > h/2$.

I tried using the relation between mutual information and conditional entropy, $I(a;b) = H(b) - H(b|a) = H(a) - H(a|b)$, but without success. Could you help me prove it? Thank you in advance.



Best answer:

We are given $I(a;b) = \frac{3h}{4}$ and $I(a;c) = \frac{3h}{4}$. The interaction information satisfies the identity $I(a;b;c) = I(a;b) + I(a;c) - I(a;b,c)$, and since $a$ determines itself, $I(a;b,c) \le H(a) = h$.

Hence,

$I(a;b;c) \ge \frac{3h}{4} + \frac{3h}{4} - h = \frac{h}{2}$.

Since $I(b;c) = I(a;b;c) + I(b;c|a)$ and $I(b;c|a) \ge 0$, we get $I(b;c) \ge I(a;b;c) \ge \frac{h}{2}$. So if $I(b;c) \ne I(a;b;c)$ (i.e. $I(b;c|a) > 0$), then $I(b;c) > \frac{h}{2}$;

otherwise only $I(b;c) \ge \frac{h}{2}$ is guaranteed.
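As a sanity check (not part of the original answer), here is a small numerical example, assuming $a$ is four i.i.d. fair bits with $b$ and $c$ two overlapping triples of those bits; entropies are computed by brute-force counting over the 16 equally likely outcomes. It also shows the bound $h/2$ is attained with equality, so the strict inequality cannot hold unconditionally:

```python
import itertools
import math

def H(labels):
    # Shannon entropy (in bits) of the empirical distribution of `labels`,
    # where each element of `labels` is one equally likely outcome.
    n = len(labels)
    counts = {}
    for x in labels:
        counts[x] = counts.get(x, 0) + 1
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def I(xs, ys):
    # Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
    return H(xs) + H(ys) - H(list(zip(xs, ys)))

# a = four i.i.d. fair bits; b = first three bits; c = last three bits.
outcomes = list(itertools.product([0, 1], repeat=4))
a = [w for w in outcomes]
b = [w[:3] for w in outcomes]
c = [w[1:] for w in outcomes]

h = H(a)                          # h = 4 bits
print(I(a, b), I(a, c))           # both equal 3 = 3h/4
print(I(b, c), h / 2)             # both equal 2, so I(b;c) = h/2 exactly
```

Here $I(b;c) = h/2$ exactly, matching the case $I(b;c) = I(a;b;c)$ in the answer above.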