How to show the conditional entropy $H(Y|X)$ is equal to $H(X+Y|X)$?


I have a random variable $X$ that takes the values $x_1, x_2, \dots, x_4$ and $Y$ takes the values $y_1, y_2, \dots, y_s$. I define $Z = X + Y$, and I'm trying to show that $H(Z \mid X) = H(Y \mid X)$.

I know that, when $X$ and $Y$ are independent, $$P(Z=z)=\sum_{x\in X}P(X=x)\,P(Y=z-x),$$ but I'm not sure how to expand $$H(Y\mid X)=\sum_{x\in X,\,y\in Y}p(x,y)\log_2\frac{p(x)}{p(x,y)}$$ to get $H(Z\mid X)$.
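As an aside, that convolution can be sanity-checked numerically. Here is a minimal sketch with made-up marginals (the distributions below are arbitrary, and independence of $X$ and $Y$ is assumed, as the convolution requires):

```python
from itertools import product

# Hypothetical independent marginals, chosen arbitrarily for illustration.
p_x = {0: 0.5, 1: 0.3, 2: 0.2}
p_y = {0: 0.6, 1: 0.4}

# Convolution: P(Z = z) = sum over x of P(X = x) * P(Y = z - x),
# valid when X and Y are independent.
p_z = {}
for x, y in product(p_x, p_y):
    p_z[x + y] = p_z.get(x + y, 0.0) + p_x[x] * p_y[y]

print(p_z)  # e.g. P(Z=1) = 0.5*0.4 + 0.3*0.6 = 0.38
```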


BEST ANSWER

Take your definitions: $$\begin{align}\mathsf H(Z\mid X)&=\sum_{x,z}p_{\small X,Z}(x,z)\log(p_{\small X}(x)/p_{\small X,Z}(x,z))\\[1ex]&=\sum_{x,y} p_{\small X,Z}(x,x+y)\log(p_{\small X}(x)/p_{\small X,Z}(x,x+y))\\[2ex]\mathsf H(Y\mid X)&=\sum_{x,y} p_{\small X,Y}(x,y)\log(p_{\small X}(x)/p_{\small X,Y}(x,y))\end{align}$$

Now consider that for discrete random variables $X,Y$ and $Z=X+Y$ we have $$p_{\small X,Z}(x,x+y)=p_{\small X,Y}(x,y),$$ because the events $\{X=x,\,Z=x+y\}$ and $\{X=x,\,Y=y\}$ are identical. Substituting this into the first sum turns it, term by term, into the second.
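For a concrete sanity check, here is a short Python sketch that builds $p_{X,Z}$ from $p_{X,Y}$ exactly as above and confirms the two conditional entropies coincide (the joint pmf is an arbitrary made-up example; no independence is assumed):

```python
import math

# Hypothetical joint pmf p_{X,Y}(x, y): arbitrary positive weights, normalized.
weights = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 1, (2, 0): 2, (2, 1): 3}
total = sum(weights.values())
p_xy = {k: w / total for k, w in weights.items()}

def cond_entropy(joint):
    """H(B | A) in bits, for a joint pmf given as {(a, b): p}."""
    marg = {}
    for (a, _), p in joint.items():
        marg[a] = marg.get(a, 0.0) + p
    return sum(p * math.log2(marg[a] / p) for (a, _), p in joint.items() if p > 0)

# Joint pmf of (X, Z) with Z = X + Y, via p_{X,Z}(x, x+y) = p_{X,Y}(x, y).
p_xz = {}
for (x, y), p in p_xy.items():
    p_xz[(x, x + y)] = p_xz.get((x, x + y), 0.0) + p

print(cond_entropy(p_xy))  # H(Y | X)
print(cond_entropy(p_xz))  # H(Z | X), the same value
```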


Alternatively: you know the chain rule $H(X,Z)=H(Z)+H(X \mid Z) = H(X)+H(Z\mid X)$. The same identity holds when everything is conditioned on a third variable $Y$:

$$ H(Z\mid Y) +H(X\mid Z,Y) = H(X\mid Y)+H(Z\mid X,Y) $$

Now, let $Z=X+Y$

We have $H(Z\mid X,Y)=0$, because $Z=g(X,Y)$ is a deterministic function of $(X,Y)$.

But we also have $H(X\mid Z,Y)=0$ for the same reason: given $Z$ and $Y$, we know $X=Z-Y$.

Hence $$ H(X+Y\mid Y) = H(X\mid Y),$$ and swapping the roles of $X$ and $Y$ gives the identity you asked for, $H(X+Y\mid X)=H(Y\mid X)$.

Actually this proves a more general result:

$$ H(g(X,Y) \mid Y) = H(X\mid Y)$$ for any function $g$ that is invertible with respect to its first argument, i.e. $g(x_1,y) = g(x_2,y) \implies x_1 = x_2$.
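This more general fact can also be checked numerically. The sketch below uses the arbitrarily chosen function $g(x,y)=x-y$, which is invertible in $x$ for each fixed $y$, together with a made-up joint pmf:

```python
import math

# Hypothetical joint pmf p_{X,Y}(x, y): arbitrary positive weights, normalized.
weights = {(0, 0): 2, (0, 1): 1, (1, 0): 1, (1, 1): 3, (2, 0): 2, (2, 1): 1}
total = sum(weights.values())
p_xy = {k: w / total for k, w in weights.items()}

def cond_entropy_first_given_second(joint):
    """H(A | B) in bits, for a joint pmf given as {(a, b): p}."""
    marg = {}
    for (_, b), p in joint.items():
        marg[b] = marg.get(b, 0.0) + p
    return sum(p * math.log2(marg[b] / p) for (_, b), p in joint.items() if p > 0)

def g(x, y):
    return x - y  # invertible in x for each fixed y

# Joint pmf of (g(X, Y), Y): the map (x, y) -> (g(x, y), y) is a bijection
# on the support, so no probability mass ever merges.
p_gy = {}
for (x, y), p in p_xy.items():
    key = (g(x, y), y)
    p_gy[key] = p_gy.get(key, 0.0) + p

print(cond_entropy_first_given_second(p_xy))  # H(X | Y)
print(cond_entropy_first_given_second(p_gy))  # H(g(X,Y) | Y), the same value
```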