Differential entropy for joint distribution, bounded from below by the maximum of the marginals?


For two discrete random variables $X_1,X_2$, the joint entropy is bounded from below by the maximum of the two marginal entropies:

$$H(X_1,X_2)\geq \max_i H(X_i),$$

which can be seen from the decomposition $H(X_1,X_2)= H(X_1)+H(X_2|X_1)$, where $H(\cdot|\cdot)$ is the conditional entropy, defined as

$$ H(X_2|X_1)=-\sum_{x_1,x_2} P(x_1,x_2)\log\Big(\frac{P(x_1,x_2)}{P(x_1)}\Big)\geq 0, $$

where $P(x_1,x_2)=\mathbb{P}(X_1=x_1,X_2=x_2)$ and $P(x_1)=\mathbb{P}(X_1=x_1)$. The conditional entropy is nonnegative because the ratio $P(x_1,x_2)/P(x_1)=\mathbb{P}(X_2=x_2\mid X_1=x_1)\leq 1$, so each logarithm is nonpositive.
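As a quick numerical sanity check of the discrete inequality, here is a short Python sketch; the joint distribution `P` is an arbitrary example chosen for illustration, not taken from the question:

```python
import math
from collections import defaultdict

# An arbitrary example joint distribution over pairs (x1, x2),
# used to check H(X1, X2) >= max_i H(X_i) numerically.
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def H(dist):
    # Shannon entropy in bits of a dict {outcome: probability}.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(P, axis):
    # Sum the joint probabilities over the other coordinate.
    m = defaultdict(float)
    for xs, p in P.items():
        m[xs[axis]] += p
    return m

H12 = H(P)
H1, H2 = H(marginal(P, 0)), H(marginal(P, 1))
print(H12 >= max(H1, H2))  # True
```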

Is there a similar inequality for two continuous random variables $Y_1,Y_2$ and the differential entropy, $$ h(Y_1,Y_2) \geq \max_i h(Y_i)~?$$

Accepted answer:

No. A trivial counterexample: let $X_1,X_2$ be i.i.d. uniform on $[0,\frac12]$.

Then $h(X_1,X_2) = -2$ bits while $h(X_1)=h(X_2)= -1$ bit: the marginal density is $2$ on $[0,\frac12]$, so $h(X_1)=-\log_2 2=-1$, and the joint density is $4$ on the square $[0,\frac12]^2$, so $h(X_1,X_2)=-\log_2 4=-2$.
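The counterexample can be verified with a few lines of Python, using the fact that a uniform density on a region of volume $V$ has differential entropy $\log_2 V$:

```python
import math

def h_uniform(volume):
    # Differential entropy in bits of a uniform density on a
    # region of the given length/area: h = log2(volume).
    return math.log2(volume)

h1 = h_uniform(0.5)          # X1 ~ Uniform[0, 1/2]
h12 = h_uniform(0.5 * 0.5)   # (X1, X2) uniform on the 1/2 x 1/2 square
print(h1, h12)               # -1.0 -2.0: the joint lies BELOW the marginal
```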

In general, the chain rule still holds:

$$h(X_1,X_2) = h(X_1) + h(X_2|X_1)$$

but, unlike in the discrete case, $h(X_2|X_1)$ can be negative (here $h(X_2|X_1)=h(X_2)=-1$ bit, by independence), so no such lower bound can hold.