Verifying an equation involving entropy and mutual information


I would like to show the following:

Consider a joint distribution of $X,Y$ with density $p(x,y)$. For arbitrary functions $f,g:\mathbb{R}\to\mathbb{R}$ we have:

$$ H(X) + H(Y-f(X)) = H(Y) + H(X-g(Y)) - I(X-g(Y),Y) + I(Y-f(X),X)$$

where $H$ is the entropy and $I$ is the mutual information.
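Before proving it, the identity can be sanity-checked numerically on a discrete analogue (integer-valued $X,Y$ and integer-valued $f,g$), where the same entropy identities hold exactly. This is only a sketch; the pmf and the choices of `f` and `g` below are arbitrary:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(0)
p = rng.random((4, 4))
p /= p.sum()              # arbitrary joint pmf of (X, Y) on {0..3} x {0..3}

f = lambda x: 2 * x + 1   # arbitrary integer-valued f
g = lambda y: y * y       # arbitrary integer-valued g

def entropy(vals):
    # Shannon entropy of a pmf given as an iterable of probabilities
    v = np.array([q for q in vals if q > 0])
    return -np.sum(v * np.log(v))

# joint pmfs of (X, Y - f(X)) and (X - g(Y), Y)
p_x_yfx = defaultdict(float)
p_xgy_y = defaultdict(float)
for x in range(4):
    for y in range(4):
        p_x_yfx[(x, y - f(x))] += p[x, y]
        p_xgy_y[(x - g(y), y)] += p[x, y]

def marginal(d, axis):
    m = defaultdict(float)
    for key, prob in d.items():
        m[key[axis]] += prob
    return m.values()

HX = entropy(p.sum(axis=1))
HY = entropy(p.sum(axis=0))
H_yfx = entropy(marginal(p_x_yfx, 1))   # H(Y - f(X))
H_xgy = entropy(marginal(p_xgy_y, 0))   # H(X - g(Y))

# I(A; B) = H(A) + H(B) - H(A, B)
I_xgy_Y = H_xgy + HY - entropy(p_xgy_y.values())
I_yfx_X = H_yfx + HX - entropy(p_x_yfx.values())

lhs = HX + H_yfx
rhs = HY + H_xgy - I_xgy_Y + I_yfx_X
print(np.isclose(lhs, rhs))  # True
```

Any other pmf and integer-valued $f,g$ give the same result, since the identity does not depend on the particular functions.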

Expanding the right-hand side using $I(A,B)=H(A)+H(B)-H(A,B)$, I got

$$H(Y) + H(X-g(Y)) - H(X-g(Y)) - H(Y) + H(X-g(Y),Y) + H(Y-f(X)) + H(X) - H(Y-f(X),X)$$

which simplifies to

$$H(X) + H(Y-f(X)) + \color{red}{H(X-g(Y))+H(Y|X-g(Y)) - H(Y-f(X))-H(X|Y-f(X))}$$

What properties should I use to get rid of the red part?

Best answer:

$$H(Y) + H(X-g(Y)) - H(X-g(Y)) - H(Y) + H(X-g(Y),Y) + H(Y-f(X)) + H(X) - H(Y-f(X),X)$$

The first four terms cancel each other out, leaving:

$$H(Y-f(X)) + H(X) + H(X-g(Y),Y) - H(Y-f(X),X)$$

However, conditioned on $Y$, the quantity $g(Y)$ is a constant, and entropy is translation-invariant, so $H(X-g(Y)\mid Y)=H(X\mid Y)$. Hence

\begin{align} H(X-g(Y),Y)&=H(Y)+H(X-g(Y)|Y)\\ &=H(Y)+H(X|Y)\\ &=H(X,Y) \end{align}

and likewise $H(Y-f(X)\mid X)=H(Y\mid X)$, so

\begin{align} H(Y-f(X),X)&=H(X)+H(Y-f(X)|X)\\ &=H(X)+H(Y|X)\\ &=H(X,Y) \end{align}

Hence,

$$H(X-g(Y),Y) - H(Y-f(X),X)=H(X,Y)-H(X,Y)=0$$

and it becomes

$$H(Y-f(X)) + H(X)$$

which is the LHS.
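The key step above, that a bijective change of variables such as $(x,y)\mapsto(y-f(x),x)$ preserves joint entropy, can also be checked numerically on a small discrete joint distribution (a sketch; the pmf and `f` below are arbitrary choices):

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
p = rng.random((3, 5))
p /= p.sum()           # arbitrary joint pmf of (X, Y) on {0,1,2} x {0..4}

f = lambda x: 3 * x    # arbitrary integer-valued f

def entropy(vals):
    v = np.array([q for q in vals if q > 0])
    return -np.sum(v * np.log(v))

# The map (x, y) -> (y - f(x), x) is a bijection, so it only relabels
# the atoms of the joint pmf and the joint entropy is unchanged.
d = defaultdict(float)
for x in range(3):
    for y in range(5):
        d[(y - f(x), x)] += p[x, y]

H_joint = entropy(p.ravel())    # H(X, Y)
H_shift = entropy(d.values())   # H(Y - f(X), X)
print(np.isclose(H_joint, H_shift))  # True
```

The same argument applied to $(x,y)\mapsto(x-g(y),y)$ gives $H(X-g(Y),Y)=H(X,Y)$.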