Mutual information comparison


$X, Y, Z$ are three discrete random variables. How can we compare the following quantities?

$I(X,Y ; Y,Z)$ and $I(X; Y,Z)$?

I know that $I(X;Y) \le \min \{H(X), H(Y)\}$, and that $H(X)$ is less than or equal to $H(X,Y)$.

Can we conclude that $I(X,Y ; Y,Z) \ge I(X; Y,Z)$?

**1 answer:**

Definitely. By the chain rule for mutual information, $$I(X,Y;Y,Z)=I(X;Y,Z)+I(Y;Y,Z\mid X),$$ and since $Y$ is determined by $(Y,Z)$ we get $I(Y;Y,Z\mid X)=H(Y\mid X)-H(Y\mid X,Y,Z)=H(Y\mid X)$. Entropy and mutual information are always non-negative for discrete random variables, so $I(X,Y;Y,Z)=I(X;Y,Z)+H(Y\mid X)\ge I(X;Y,Z)$.
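As a sanity check, the identity $I(X,Y;Y,Z)=I(X;Y,Z)+H(Y\mid X)$ and the resulting inequality can be verified numerically on a random joint distribution. The sketch below (standard library only; the helper names `H` and `I` are my own) computes all quantities directly from a random pmf over three binary variables:

```python
import itertools
import math
import random

random.seed(0)

# Random joint pmf over (X, Y, Z), each variable binary.
states = list(itertools.product(range(2), repeat=3))
weights = [random.random() for _ in states]
total = sum(weights)
p = {s: w / total for s, w in zip(states, weights)}

def H(indices):
    """Entropy (in bits) of the marginal over the given coordinate indices."""
    marg = {}
    for s, ps in p.items():
        key = tuple(s[i] for i in indices)
        marg[key] = marg.get(key, 0.0) + ps
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

def I(a, b):
    """Mutual information I(A;B) = H(A) + H(B) - H(A,B)."""
    return H(a) + H(b) - H(sorted(set(a) | set(b)))

X, Y, Z = 0, 1, 2
lhs = I([X, Y], [Y, Z])          # I(X,Y ; Y,Z)
rhs = I([X], [Y, Z])             # I(X ; Y,Z)
H_Y_given_X = H([X, Y]) - H([X]) # H(Y|X) = H(X,Y) - H(X)

# The identity holds up to floating-point error, and lhs >= rhs follows.
print(lhs, rhs, H_Y_given_X)
```

Note that $H(A,B)=H(X,Y,Z)$ here because the pair $((X,Y),(Y,Z))$ is a bijective function of $(X,Y,Z)$, which is why `I` only needs the union of the index sets.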