Let $H( \cdot \mid \cdot )$ denote the conditional entropy of a random variable. Prove that $H(X \mid X) = 0$.
Please avoid intuitive arguments. I do have the intuition, but I'm missing the details of the proof.
Assuming that $X$ and $Y$ are discrete with joint pmf $p_{X,Y}(x,y)$ and marginal pmfs $p_X(x)$ and $p_Y(y)$, $$ H(Y\mid X)=\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}}p_{X,Y}(x,y)\ln\frac{p_X(x)}{p_{X,Y}(x,y)}, $$ where, by the usual convention $0\cdot\ln(\cdot)=0$, terms with $p_{X,Y}(x,y)=0$ contribute nothing to the sum.
If $Y=X$ (so that $\mathcal{Y}=\mathcal{X}$), then $p_{X,X}(x,y)=p_X(x)$ for $y=x$ and $p_{X,X}(x,y)=0$ otherwise. The zero terms vanish by the convention above, so only the diagonal terms $y=x$ survive: $$ H(X\mid X)=\sum_{x\in\mathcal{X}}p_X(x)\ln\frac{p_X(x)}{p_X(x)}=\sum_{x\in\mathcal{X}}p_X(x)\times 0=0. $$
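As a sanity check, the derivation above can be verified numerically. The sketch below (the function name `cond_entropy` and the example distributions are my own choices, not from the question) computes $H(Y\mid X)$ directly from the double sum, skipping zero-probability terms per the convention $0\cdot\ln(\cdot)=0$:

```python
import math

def cond_entropy(joint):
    """Conditional entropy H(Y|X) in nats, from a joint pmf given as a
    dict {(x, y): p}. Zero-probability terms are skipped, matching the
    convention 0 * ln(.) = 0."""
    # Marginal p_X(x) obtained by summing the joint over y.
    px = {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
    # Double sum: p(x,y) * ln(p_X(x) / p(x,y)) over nonzero terms.
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h += p * math.log(px[x] / p)
    return h

# Y = X with X uniform on {0, 1, 2}: the joint pmf is diagonal,
# every surviving term is (1/3) * ln(1) = 0, so H(X|X) = 0.
joint_xx = {(x, x): 1/3 for x in range(3)}
print(cond_entropy(joint_xx))  # → 0.0
```

For contrast, if $X$ and $Y$ are independent fair bits, the same function returns $\ln 2$, the full entropy of $Y$, since conditioning on $X$ then gives no information.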