Conditional entropy of a variable by itself


Let $H( \cdot \mid \cdot )$ denote the conditional entropy of one random variable given another. Prove that $H(X\mid X) = 0$.

Please avoid intuitive arguments. I do have the intuition, but I'm missing the details of the proof.


BEST ANSWER

Assuming that $X$ and $Y$ are discrete with joint pmf $p_{X,Y}(x,y)$ and marginal pmfs $p_X(x)$ and $p_Y(y)$, $$ H(Y\mid X)=\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}}p_{X,Y}(x,y)\ln\frac{p_X(x)}{p_{X,Y}(x,y)} $$

If $Y=X$ (so that $\mathcal{Y}=\mathcal{X}$), then $p_{X,X}(x,y)=p_X(x)$ for $x=y$ and $p_{X,X}(x,y)=0$ otherwise. Only the diagonal terms contribute to the sum, so $$ H(X\mid X)=\sum_{x\in\mathcal{X}}p_X(x)\ln\frac{p_X(x)}{p_X(x)}=\sum_{x\in\mathcal{X}}p_X(x)\cdot\ln 1=0. $$
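A quick numerical sanity check of the derivation above, using a small hypothetical pmf for $X$ (the distribution and variable names are illustrative, not from the question):

```python
import math

# Hypothetical marginal pmf for X over {0, 1, 2}.
p_X = {0: 0.5, 1: 0.3, 2: 0.2}

# Joint pmf of (X, X): mass p_X(x) on the diagonal, 0 elsewhere.
p_joint = {(x, y): (p_X[x] if x == y else 0.0)
           for x in p_X for y in p_X}

# H(X|X) = sum over (x, y) of p(x,y) * ln(p_X(x) / p(x,y)),
# skipping zero-probability pairs (by convention they contribute 0).
H_X_given_X = sum(p * math.log(p_X[x] / p)
                  for (x, y), p in p_joint.items() if p > 0)

print(H_X_given_X)  # 0.0
```

Every surviving term has $p_{X,X}(x,x)=p_X(x)$, so each logarithm is $\ln 1 = 0$, matching the sum above.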

ANOTHER ANSWER

$H(X\mid X)=\sum_{x} p_X(x)\, H(X\mid X=x)$ by definition. Each conditional distribution $P(X\mid X=x)$ puts all of its probability mass at the single point $x$, so each has zero entropy. Thus $H(X\mid X)=0$, being a weighted sum of the zero values $H(X\mid X=x)$.
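This decomposition can also be sketched numerically. Assuming the same kind of small illustrative pmf as before (the names and values are hypothetical), each conditional distribution given $X=x$ is degenerate, so the weighted sum collapses to zero:

```python
import math

# Hypothetical pmf for X over {0, 1, 2}.
p_X = {0: 0.5, 1: 0.3, 2: 0.2}

def entropy(pmf):
    """Shannon entropy (in nats) of a pmf given as {value: probability}."""
    return -sum(p * math.log(p) for p in pmf.values() if p > 0)

# Given X = x, the conditional distribution of X is degenerate at x,
# so each term H(X | X = x) is the entropy of a point mass: zero.
H = sum(p_X[x] * entropy({x: 1.0}) for x in p_X)

print(H)  # 0.0
```

Each call `entropy({x: 1.0})` evaluates $-1\cdot\ln 1 = 0$, so the linear combination is zero regardless of the weights $p_X(x)$.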