It is well known that Shannon's joint entropy $H(X,Y)$ and the mutual information $I(X;Y)$ between two variables $X$ and $Y$ are non-negative, which follows from Jensen's inequality. I read in a source that
$$\frac{I(X;Y)}{H(X,Y)} \leq 1$$
However, I don't understand why, and I can't find a proof of this anywhere. Can someone explain it to me or cite a source?
It follows directly from the identity
$$H(X,Y) = H(X\mid Y) + H(Y\mid X) + I(X;Y).$$
Since conditional entropies are non-negative, $H(X\mid Y) \geq 0$ and $H(Y\mid X) \geq 0$, the identity gives $I(X;Y) \leq H(X,Y)$, and hence $I(X;Y)/H(X,Y) \leq 1$ whenever $H(X,Y) > 0$. Please feel free to follow up if you need further assistance.
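As a quick numerical sanity check, the sketch below computes $H(X,Y)$ and $I(X;Y)$ for an arbitrary (made-up) $2\times 2$ joint distribution and verifies the inequality; the specific pmf `p_xy` is just an example, and any valid joint distribution works.

```python
import numpy as np

# A hypothetical 2x2 joint distribution p(x, y); any valid pmf works.
p_xy = np.array([[0.4, 0.1],
                 [0.2, 0.3]])

p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = entropy(p_xy.ravel())               # joint entropy H(X,Y)
I_xy = entropy(p_x) + entropy(p_y) - H_xy  # mutual information I(X;Y)

print(I_xy / H_xy)  # a ratio in [0, 1]
```

Because $I(X;Y) = H(X) + H(Y) - H(X,Y)$, the computed ratio is non-negative and never exceeds $1$, matching the identity above.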