Using Gibbs' inequality to prove $H(X,Y) \le H(X) + H(Y)$


I would like to prove

$$H(X,Y) \le H(X) + H(Y),$$

with $H$ denoting the entropy of the random variables $X$ and $Y$. I know that I have to use Gibbs' inequality, but I don't see how. Using the definition of entropy, it remains to show that

$$-\sum_{x \in \operatorname{img}(X),\, y \in \operatorname{img}(Y)} p(X = x, Y = y) \log p(X = x, Y = y) \le -\sum_{x \in \operatorname{img}(X)} p(X = x) \log p(X = x) - \sum_{y \in \operatorname{img}(Y)} p(Y = y) \log p(Y = y).$$
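As a quick numerical sanity check (my own addition, not part of the proof), the inequality can be verified on a small example joint distribution:

```python
import math

def entropy(dist):
    """Shannon entropy (base 2) of a distribution given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# An example joint distribution p(X = x, Y = y) over x, y in {0, 1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginals p(X = x) and p(Y = y), obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

H_XY = entropy(joint)
H_X = entropy(px)
H_Y = entropy(py)

# Subadditivity: H(X,Y) <= H(X) + H(Y).
assert H_XY <= H_X + H_Y + 1e-12
```

Equality would hold exactly when $X$ and $Y$ are independent, i.e. when the joint distribution factors into its marginals.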

In order to use Gibbs' inequality, I would have to find a suitable comparison distribution whose sum equals $\sum_{x \in \operatorname{img}(X),\, y \in \operatorname{img}(Y)} p(X = x, Y = y)$, but I have no idea what to choose.
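For reference, Gibbs' inequality states that for probability distributions $p$ and $q$ on the same finite set, $-\sum_i p_i \log p_i \le -\sum_i p_i \log q_i$. A sketch of how it might apply here, assuming the comparison distribution is taken to be the product of the marginals, $q(x, y) = p(X = x)\,p(Y = y)$ (my guess, not something given above):

$$\begin{aligned}
H(X,Y) &= -\sum_{x,y} p(X = x, Y = y) \log p(X = x, Y = y) \\
&\le -\sum_{x,y} p(X = x, Y = y) \log\bigl(p(X = x)\,p(Y = y)\bigr) \\
&= -\sum_{x} p(X = x) \log p(X = x) - \sum_{y} p(Y = y) \log p(Y = y),
\end{aligned}$$

where the last step uses $\sum_y p(X = x, Y = y) = p(X = x)$ and the analogous identity for $Y$.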