The following inequality holds:
$$I(X;Y) \leq \min \{ \log| \mathcal X |, \log| \mathcal Y | \}$$
where $I(X;Y)$ is the mutual information. I know that
$H(X) \leq \log|\mathcal X|$ is an upper bound on entropy, which can be proved by showing $\log|\mathcal X| - H(X) \geq 0$.
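As a quick numerical sanity check of the entropy bound (the distribution below is just a made-up example, not part of any proof):

```python
from math import log2

# Entropy of a toy distribution over an alphabet of size 4.
# H(X) should never exceed log2(4) = 2 bits.
p = [0.5, 0.25, 0.125, 0.125]
H = -sum(q * log2(q) for q in p if q > 0)
print(H, log2(len(p)))  # 1.75 <= 2.0
```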
And $I(X;Y) = H(X)+H(Y)-H(X,Y)$ by definition.
Now my question is, how do I prove the above inequality? Besides, is there any physical meaning to it?
Proof and physical meaning of $I(X;Y) \leq \min \{ \log| \mathcal X |, \log| \mathcal Y | \}$
Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail)
Starting with $I(X;Y) = H(X)+H(Y)-H(X,Y)$ as you did, I used the chain-rule identity $H(X,Y)=H(X)+H(Y|X)=H(Y)+H(X|Y)$ and got $I(X;Y) = H(X)-H(X|Y)=H(Y)-H(Y|X)$.
I can now upper-bound both expressions using your inequality $H(X) \le \log|\mathcal X|$ and the fact that conditional entropy is nonnegative, $H(X|Y) \ge 0$, to get $I(X;Y) \le \log|\mathcal Y|$ and $I(X;Y) \le \log|\mathcal X|$.
Since both bounds hold simultaneously, so does the tighter of the two: $I(X;Y) \le \min\{\log|\mathcal X|, \log|\mathcal Y|\}$. As for the physical meaning: mutual information measures how many bits one variable tells you about the other, and you cannot learn more about a variable than the number of bits needed to describe it in the first place, which is at most $\log|\mathcal X|$ (and symmetrically $\log|\mathcal Y|$).
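To see the bound in action, here is a small numerical check on an arbitrary $2 \times 3$ joint distribution (the numbers are made up for illustration; $\min\{\log 2, \log 3\} = 1$ bit here):

```python
from math import log2

# Joint distribution p(x, y) over |X| = 2 rows and |Y| = 3 columns.
pxy = [[0.2, 0.1, 0.1],
       [0.1, 0.3, 0.2]]
px = [sum(row) for row in pxy]            # marginal of X
py = [sum(col) for col in zip(*pxy)]      # marginal of Y

# I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) )
I = sum(pxy[i][j] * log2(pxy[i][j] / (px[i] * py[j]))
        for i in range(len(px)) for j in range(len(py))
        if pxy[i][j] > 0)

bound = min(log2(len(px)), log2(len(py)))  # min(log2 2, log2 3) = 1.0
print(I, bound)                            # I is small and <= 1.0
```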