Mutual information of partitions


Let $X$ and $Y$ be random variables and let $H$ denote the information entropy. We have the following formula: $$ H((X,Y)) = H(X)+H(Y)-I(X,Y).$$ Instead of working with the random variables themselves, we can consider the partitions they induce, which we call $\pi_X$, $\pi_Y$ and $\pi_{X,Y}$. Partitions are ordered by refinement and form a lattice, and $\pi_{X,Y}$ is the greatest lower bound of $\pi_X$ and $\pi_Y$: $$\pi_{X,Y}=\pi_X \land \pi_Y.$$
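For concreteness, here is a minimal numeric check of this identity in Python; the joint pmf below is a made-up toy example, not from the question:

```python
import math

# A toy joint pmf p(x, y); the numbers are hypothetical.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Marginal pmfs of X and Y.
px, py = {}, {}
for (x, y), q in p.items():
    px[x] = px.get(x, 0.0) + q
    py[y] = py.get(y, 0.0) + q

# Mutual information from its definition as an expected log-ratio.
I = sum(q * math.log2(q / (px[x] * py[y])) for (x, y), q in p.items() if q > 0)

# Check: H((X,Y)) == H(X) + H(Y) - I(X,Y).
assert abs(H(p) - (H(px) + H(py) - I)) < 1e-12
print(f"H(X,Y) = {H(p):.4f}, H(X) + H(Y) - I = {H(px) + H(py) - I:.4f}")
```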

Question: is there any chance that $I(\pi_X,\pi_Y)=H(\pi_X \lor \pi_Y)$? We would then have the nice formula $$ H(\pi_X \land \pi_Y) = H(\pi_X)+H(\pi_Y)-H(\pi_X \lor \pi_Y).$$ Though I feel that it's too nice to be true. Thank you.


There is 1 solution below.


What you are asking about, $H(\pi_X \vee \pi_Y)$, is typically called the "common information" in information theory. There is a nice review in Li and Chong, "On a Connection between Information and Group Lattices", Entropy, 2011.

It has been shown that the common information is always at most the mutual information, and is typically much smaller. See P. Gács and J. Körner, "Common information is far less than mutual information", Problems of Control and Information Theory, vol. 2, no. 2, pp. 149-162, 1973. Since the inequality is strict in general, the conjectured identity $H(\pi_X \land \pi_Y) = H(\pi_X)+H(\pi_Y)-H(\pi_X \lor \pi_Y)$ fails for typical pairs of random variables.
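To see the gap concretely, here is a small sketch comparing the two quantities on a pair of correlated bits. It assumes a finite sample space with an explicit toy distribution; the `meet`/`join` helpers and the numbers are illustrative, not taken from either reference:

```python
import math

# Toy sample space: outcomes (x, y) of two correlated bits that
# agree with probability 0.8. All numbers here are hypothetical.
prob = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(partition):
    """Entropy in bits of a partition: blocks weighted by total probability."""
    masses = [sum(prob[w] for w in block) for block in partition]
    return -sum(m * math.log2(m) for m in masses if m > 0)

# Partitions induced by X and Y: blocks are preimages of each value.
pi_X = [frozenset(w for w in prob if w[0] == v) for v in (0, 1)]
pi_Y = [frozenset(w for w in prob if w[1] == v) for v in (0, 1)]

def meet(p, q):
    """Common refinement: nonempty pairwise intersections of blocks."""
    return [a & b for a in p for b in q if a & b]

def join(p, q):
    """Finest common coarsening: repeatedly merge overlapping blocks."""
    blocks = [set(b) for b in p + q]
    merged = True
    while merged:
        merged = False
        for i in range(len(blocks)):
            for j in range(i + 1, len(blocks)):
                if blocks[i] & blocks[j]:
                    blocks[i] |= blocks.pop(j)
                    merged = True
                    break
            if merged:
                break
    return [frozenset(b) for b in blocks]

I = H(pi_X) + H(pi_Y) - H(meet(pi_X, pi_Y))  # mutual information I(X,Y)
C = H(join(pi_X, pi_Y))                      # entropy of the join
print(f"I(X,Y) = {I:.4f} bits,  H(pi_X v pi_Y) = {C:.4f} bits")
```

With these numbers, every block of $\pi_X$ overlaps every block of $\pi_Y$, so the join is the trivial one-block partition and $H(\pi_X \lor \pi_Y)=0$, while $I(X,Y)\approx 0.278$ bits. The conjectured identity would require these two quantities to be equal, so it fails already for this pair.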