An Inequality In Information Theory


I know that for discrete random variables $X$ and $Y$,

$H(X,Y) \geq H(X)$ and $H(X,Y) \geq H(Y)$

My question is: given that $A$ and $B$ are continuous random variables, is there a continuous counterpart of this inequality, that is,

$h(A,B) \geq h(A)$ and $h(A,B) \geq h(B)$
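The discrete inequality is easy to verify numerically. Below is a minimal sketch using the Python standard library; the $2\times 2$ joint pmf `p` is an arbitrary illustrative example, not taken from the question.

```python
import math

# Hypothetical 2x2 joint pmf p(x, y) -- any valid joint pmf works here.
p = [[0.1, 0.3],
     [0.4, 0.2]]

def H(probs):
    """Shannon entropy (in bits) of a list of probabilities."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

H_XY = H([q for row in p for q in row])                       # joint entropy H(X,Y)
H_X = H([sum(row) for row in p])                              # marginal entropy H(X)
H_Y = H([sum(p[i][j] for i in range(2)) for j in range(2)])   # marginal entropy H(Y)

print(H_XY >= H_X and H_XY >= H_Y)  # True -- holds for every discrete joint pmf
```

Since $H(Y|X) \ge 0$ always holds in the discrete case, $H(X,Y) = H(X) + H(Y|X) \ge H(X)$ never fails there; the question is whether this survives the passage to differential entropy.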


By the chain rule, \begin{align} h(A,B)= h(B)+h(A|B). \end{align} However, unlike its discrete counterpart, the conditional differential entropy $h(A|B)$ can be negative. Therefore, the inequality \begin{align} h(A,B)\ge h(B) \end{align} holds if and only if $h(A|B) \ge 0$, and in general it can fail. (Symmetrically, $h(A,B)=h(A)+h(B|A)$, so $h(A,B)\ge h(A)$ holds if and only if $h(B|A)\ge 0$.)

An example with $h(A|B) <0$ can be constructed as follows: let \begin{align} A=B+W, \end{align} where $W \sim \mathcal{N}(0,\sigma^2)$ is independent of $B$. Then \begin{align} h(A|B)=h(W)=0.5 \log (2 \pi e \sigma^2). \end{align} To make $h(A|B)$ negative, take $\sigma^2<\frac{1}{2 \pi e}$.
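A quick numerical check of this counterexample (a minimal sketch; entropies are in nats, and the sample variances below are illustrative choices):

```python
import math

def gaussian_entropy(sigma2):
    """Differential entropy (in nats) of W ~ N(0, sigma2): 0.5 * ln(2*pi*e*sigma2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

threshold = 1 / (2 * math.pi * math.e)  # ~0.0585; h(W) = 0 exactly at this variance

print(gaussian_entropy(threshold))      # ~0 (boundary case)
print(gaussian_entropy(threshold / 2))  # negative: here h(A|B) = h(W) < 0
print(gaussian_entropy(1.0))            # positive for sigma^2 = 1
```

Any $\sigma^2$ below the threshold $\frac{1}{2\pi e}$ makes $h(A|B)$ negative, and hence $h(A,B) < h(B)$ in this construction.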