Why is the conditional entropy $0$?


While reading *Elements of Information Theory*, I am stuck on this paragraph on page 35 [screenshot of the textbook paragraph omitted]: I cannot understand why $H(X\mid Y,Z)$ is $0$ (is there any evidence that $Y$ and $Z$ are independent of $X$?), nor why $P(Z=0)H(X\mid Z=0)$ is also $0$.


There is 1 answer below.

On BEST ANSWER
  • If $Z=X+Y$ and you condition on both $Y$ and $Z$ (roughly speaking, "if you know $Y$ and $Z$"), then there is no uncertainty about $X$ (you "fully know it"): $$ X=Z-Y. $$

  • For the second one: looking at $H(X\mid Z=0)$, recall that $X,Y\in\{0,1\}$. So if $0=Z=X+Y$, then we must have $X=Y=0$: $X$ is then fully determined by the fact that $Z=0$, and there is no uncertainty there either. (Same thing for $Y$; so $H(X\mid Z=0)=H(Y\mid Z=0)=0$.)
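Both claims can also be checked numerically. Below is a minimal sketch, assuming the setup of the book's example: $X$ and $Y$ are independent fair bits and $Z = X + Y$ (the function names are illustrative, not from any library):

```python
from collections import defaultdict
from math import log2

# Joint distribution under the assumed setup:
# X, Y independent fair bits, Z = X + Y.
joint = defaultdict(float)
for x in (0, 1):
    for y in (0, 1):
        joint[(x, y, x + y)] += 0.25  # P(X=x, Y=y, Z=x+y)

def H_X_given(cond):
    """H(X | V), where V selects coordinates of (X, Y, Z) by index."""
    by_v = defaultdict(lambda: defaultdict(float))
    for (x, y, z), pr in joint.items():
        v = tuple((x, y, z)[i] for i in cond)
        by_v[v][x] += pr
    h = 0.0
    for px in by_v.values():
        pv = sum(px.values())  # P(V = v)
        h += pv * -sum(q / pv * log2(q / pv) for q in px.values() if q > 0)
    return h

def H_X_given_Z_equals(z0):
    """The single conditional entropy H(X | Z = z0)."""
    px = defaultdict(float)
    for (x, y, z), pr in joint.items():
        if z == z0:
            px[x] += pr
    pz = sum(px.values())
    return -sum(q / pz * log2(q / pz) for q in px.values() if q > 0)

print(H_X_given((1, 2)))      # H(X|Y,Z) = 0.0, since X = Z - Y
print(H_X_given_Z_equals(0))  # H(X|Z=0) = 0.0, since Z=0 forces X=0
print(H_X_given((2,)))        # H(X|Z) = 0.5: only Z=1 leaves uncertainty
```

Note that $H(X\mid Z)$ itself is not zero: the only term contributing to it is $P(Z=1)\,H(X\mid Z=1) = \tfrac12\cdot 1$ bit, because $Z=0$ and $Z=2$ each determine $X$ completely.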