Mutual Information and Entropy calculation


It is well known that Shannon's joint entropy $H(X,Y)$ and the mutual information $I(X;Y)$ between two variables $X$ and $Y$ are both non-negative, which follows from Jensen's inequality. I read in a source that

$$\frac{I(X;Y)}{H(X,Y)} \leq 1.$$

However, I don't understand why, and I cannot find a proof of it anywhere. Can someone explain it to me or cite a source?


On BEST ANSWER

It follows directly from the decomposition of the joint entropy:


$$H(X,Y) = H(X\mid Y) + H(Y\mid X) + I(X;Y).$$

Since the conditional entropies satisfy $H(X\mid Y) \geq 0$ and $H(Y\mid X) \geq 0$, this identity gives $I(X;Y) \leq H(X,Y)$, and hence $\frac{I(X;Y)}{H(X,Y)} \leq 1$ whenever $H(X,Y) > 0$.

Feel free to follow up if you need further assistance.
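As a numerical sanity check, here is a small Python sketch that computes the quantities above for a made-up binary joint distribution (the probabilities are purely illustrative, not from the original post) and verifies both the identity and the bound:

```python
import math

# Hypothetical joint distribution p(x, y) over two binary variables
# (values chosen for illustration only).
p_xy = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def entropy(probs):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_xy = entropy(p_xy.values())          # H(X,Y)
H_x = entropy(p_x.values())            # H(X)
H_y = entropy(p_y.values())            # H(Y)

I_xy = H_x + H_y - H_xy                # I(X;Y) = H(X) + H(Y) - H(X,Y)
H_x_given_y = H_xy - H_y               # H(X|Y) = H(X,Y) - H(Y)
H_y_given_x = H_xy - H_x               # H(Y|X) = H(X,Y) - H(X)

# The identity H(X,Y) = H(X|Y) + H(Y|X) + I(X;Y) holds exactly.
assert abs(H_xy - (H_x_given_y + H_y_given_x + I_xy)) < 1e-12

# Both conditional entropies are non-negative, so the ratio is at most 1.
print(I_xy / H_xy)
```

Because $H(X\mid Y)$ and $H(Y\mid X)$ are non-negative, the printed ratio always lands in $[0, 1]$; it equals $1$ exactly when $X$ and $Y$ determine each other.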