If H is the entropy function (information theory), then what is a simple way to prove that
$H(X, Y) + H(Y, Z) \geq H(X,Z) + H(Y)$
for all random variables $X, Y, Z$?
The inequality follows from the chain rule together with the fact that conditioning cannot increase entropy:
\begin{align*} H(X,Y) + H(Y,Z) &= H(Y) + H(X \mid Y) + H(Y,Z) && \text{(chain rule)} \\ &\geq H(Y) + H(X \mid Y,Z) + H(Y,Z) && \text{(conditioning reduces entropy)} \\ &= H(Y) + H(X,Y,Z) && \text{(chain rule)} \\ &\geq H(Y) + H(X,Z) && \text{(dropping a variable cannot increase entropy)} \end{align*}
This is exactly the submodularity of entropy: $H(X,Y) + H(Y,Z) \geq H(X,Y,Z) + H(Y)$, combined with monotonicity $H(X,Y,Z) \geq H(X,Z)$.
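Not a proof, but a quick numerical sanity check of the inequality: the sketch below (all names are illustrative) draws random joint distributions $p(x,y,z)$ on a small alphabet, computes the four entropies from the appropriate marginals, and verifies $H(X,Y) + H(Y,Z) \geq H(X,Z) + H(Y)$ on each one.

```python
# Sanity check: H(X,Y) + H(Y,Z) >= H(X,Z) + H(Y) on random joint pmfs.
import itertools
import math
import random

def entropy(pmf):
    """Shannon entropy (bits) of a pmf given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf keyed by (x, y, z) onto the given axes."""
    out = {}
    for key, p in joint.items():
        k = tuple(key[a] for a in axes)
        out[k] = out.get(k, 0.0) + p
    return out

random.seed(0)
for _ in range(1000):
    # Random joint distribution over a 3x3x3 alphabet.
    weights = {k: random.random() for k in itertools.product(range(3), repeat=3)}
    total = sum(weights.values())
    joint = {k: w / total for k, w in weights.items()}

    lhs = entropy(marginal(joint, (0, 1))) + entropy(marginal(joint, (1, 2)))
    rhs = entropy(marginal(joint, (0, 2))) + entropy(marginal(joint, (1,)))
    assert lhs >= rhs - 1e-12, (lhs, rhs)
```

The `1e-12` tolerance only guards against floating-point rounding; the inequality itself is exact.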