Properties of joint entropy


I'm trying to show the following, but am stuck.

For discrete random variables $X$ and $Y$, show $$H(X,Y)\geq \max\{H(X),H(Y)\}$$ where $H$ represents entropy




Hint:

You should show that $H(X,Y) \ge H(X)$ and $H(X,Y) \ge H(Y)$

By marginalization, $P(X=x_i)=\sum \limits_{j=1}^{n}P(x_i,y_j)$, so

$H(X,Y)-H(X) = \left(-\sum\limits_{i,j}P(x_i,y_j)\log P(x_i,y_j)\right)-\left(-\sum\limits_{i}P(x_i)\log P(x_i)\right)\\ =\left(-\sum\limits_{i,j}P(x_i,y_j)\log P(x_i,y_j)\right)-\left(-\sum\limits_{i,j}P(x_i,y_j)\log P(x_i)\right)\\=-\sum\limits_{i,j}P(x_i,y_j)\log \frac{P(x_i,y_j)}{P(x_i)}$

Since $P(x_i,y_j)\le P(x_i)$, each logarithm is at most $0$, so the sum is nonnegative and $H(X,Y)\ge H(X)$. The same argument with the roles of $X$ and $Y$ swapped gives $H(X,Y)\ge H(Y)$.
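A quick numerical sanity check of the inequality. This is just a sketch: the joint distribution below is an arbitrary choice for illustration, not part of the problem.

```python
import math

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute 0
    return -sum(q * math.log2(q) for q in p if q > 0)

# Arbitrary example joint distribution P(X=i, Y=j) (rows: x, cols: y)
joint = [[0.2, 0.1],
         [0.3, 0.4]]

p_x = [sum(row) for row in joint]        # marginal of X (sum over j)
p_y = [sum(col) for col in zip(*joint)]  # marginal of Y (sum over i)

H_XY = entropy([p for row in joint for p in row])
H_X, H_Y = entropy(p_x), entropy(p_y)

print(H_XY >= max(H_X, H_Y))  # True
```

Swapping in any other valid joint distribution leaves the comparison `True`, as the proof above guarantees.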


Alternatively, by the chain rule for entropy,

$$H(X,Y) = H(X) + H(Y\mid X) = H(Y) + H(X\mid Y)$$

Assume that $\max\{H(X),H(Y)\} = H(X)$. Then it suffices to show

$$H(X,Y) \ge \;H(X) \iff H(X) + H(Y\mid X) \ge \;H(X) \iff H(Y\mid X) \ge\; 0$$

which holds because the entropy of a discrete random variable (in contrast to the differential entropy of a continuous random variable) is always nonnegative.

An analogous argument applies if $\max\{H(X),H(Y)\} = H(Y)$. QED
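The bound is tight exactly when the conditional entropy vanishes, i.e. when one variable determines the other. A minimal numerical illustration (assuming the degenerate case $Y = X$ for a fair coin, chosen here for simplicity):

```python
import math

def entropy(p):
    # Shannon entropy in bits; zero-probability outcomes contribute 0
    return -sum(q * math.log2(q) for q in p if q > 0)

# Y = X for a fair coin: mass only on the diagonal
joint = [[0.5, 0.0],
         [0.0, 0.5]]

p_x = [sum(row) for row in joint]
H_XY = entropy([p for row in joint for p in row])
H_X = entropy(p_x)

# Chain rule: H(Y|X) = H(X,Y) - H(X) = 0, so H(X,Y) = max{H(X), H(Y)}
print(H_XY - H_X)  # 0.0
```

Here $H(X,Y) = H(X) = 1$ bit, so the inequality holds with equality.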