Proof of $H(Y|X)=H(Y)$ when X and Y are independent


Does anyone know how to prove that $H(Y|X)=H(Y)$ when $X$ and $Y$ are independent? I know the proof of $H(Y|X)=-\sum_{x,y} p(x,y)\log_2 p(y|x)$, but I can't see how to derive $H(Y|X)=H(Y)$ for independent $X$ and $Y$ from this formula.

I have tried substituting $p(x,y)=p(x)p(y)$ and expanding $\log_2 \frac{p(x,y)}{p(x)}=\log_2 p(x,y)-\log_2 p(x)$, trying to turn $H(Y|X)$ into $H(Y)$ under independence, but unfortunately I couldn't make it work.

The proof I was following is from https://en.wikipedia.org/wiki/Conditional_entropy.


Best answer:

Note that
\begin{align}
H(Y|X) & = \sum_{y\in\mathcal{Y},\,x\in\mathcal{X}} p(x,y)\log\left[\frac{p(x)}{p(x,y)}\right]\\
& = \sum_{y\in\mathcal{Y},\,x\in\mathcal{X}} p(x)p(y)\log\left[\frac{p(x)}{p(x)p(y)}\right]\\
& = \sum_{y\in\mathcal{Y}}p(y)\log\left[\frac{1}{p(y)}\right]\sum_{x\in\mathcal{X}} p(x)\\
& = \sum_{y\in\mathcal{Y}}p(y)\log\left[\frac{1}{p(y)}\right]\\
& = H(Y).
\end{align}
Here the first equality rewrites $p(y|x)=p(x,y)/p(x)$ inside the logarithm, the second uses independence ($p(x,y)=p(x)p(y)$), the third cancels $p(x)$ in the logarithm and factors the double sum, and the fourth uses $\sum_{x\in\mathcal{X}} p(x)=1$.
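The identity is easy to check numerically. A minimal sketch, using hypothetical marginals for $X$ and $Y$ chosen only for illustration, computes $H(Y|X)=-\sum_{x,y} p(x,y)\log_2 p(y|x)$ directly from the joint distribution and compares it with $H(Y)$:

```python
import math

# Hypothetical marginals for two independent variables X and Y.
px = {0: 0.3, 1: 0.7}
py = {0: 0.25, 1: 0.5, 2: 0.25}

# Independence: p(x, y) = p(x) * p(y).
pxy = {(x, y): px[x] * py[y] for x in px for y in py}

# H(Y|X) = -sum over (x, y) of p(x, y) * log2 p(y|x), with p(y|x) = p(x, y) / p(x).
H_Y_given_X = -sum(p * math.log2(p / px[x]) for (x, y), p in pxy.items())

# H(Y) = -sum over y of p(y) * log2 p(y).
H_Y = -sum(p * math.log2(p) for p in py.values())

print(H_Y_given_X, H_Y)
assert abs(H_Y_given_X - H_Y) < 1e-9  # equal up to floating-point error
```

With these marginals both values come out to $1.5$ bits, matching the cancellation $\sum_x p(x)=1$ in the derivation above.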