Conditional Shannon entropy: does $H(X|Y)=0 \Rightarrow H(Y|X)=0$?


I would like to prove rigorously that $H(X|Y)=0 \Rightarrow H(Y|X)=0$, where $H$ is the Shannon entropy, but I'm struggling.

First: I think it is correct because if $H(X|Y)=0$, it intuitively means that my variables $X$ and $Y$ are fully correlated. But I'm stuck on the details of the proof I tried to make.

From the definitions, I have:

$$ H(X|Y)=\sum_{x,y} P_{X \cup Y} (x,y) \ln(\frac{P_Y(y)}{P_{X \cup Y}(x,y)})$$
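This definition can be checked numerically. Here is a minimal sketch (the function name and variable names are my own, not from any library) that computes $H(X|Y)$ in nats from a table of joint probabilities:

```python
import math

def cond_entropy_x_given_y(joint):
    """H(X|Y) in nats, given a dict {(x, y): p} of joint probabilities."""
    # Marginal P_Y(y) = sum over x of P(x, y)
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # H(X|Y) = sum_{x,y} P(x,y) * ln(P_Y(y) / P(x,y)), skipping zero-probability terms
    return sum(p * math.log(p_y[y] / p)
               for (x, y), p in joint.items() if p > 0)

# Example where Y fully determines X (here X = Y, uniform on {0, 1}):
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(cond_entropy_x_given_y(joint))  # 0.0
```

As expected, whenever each $y$ pins down a unique $x$, every surviving term has $P_Y(y)=P_{X \cup Y}(x,y)$ and the sum vanishes.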

$H(X|Y)=0$ thus means that:

$\forall (x,y) : P_{X \cup Y}(x,y)=0$ or $P_Y(y)=P_{X \cup Y}(x,y)$

To prove that $H(Y|X)=0$, I would thus need:

$\forall (x,y) : P_{X \cup Y}(x,y)=0$ or $P_X(x)=P_{X \cup Y}(x,y)$

But I can't find a way to arrive at this result...

The one thing I can see is this: since I must have $P_Y(y)=P_{X \cup Y}(x,y)$ whenever $P_{X \cup Y}(x,y) \neq 0$, it follows that for fixed $y$ there is only one $x$ for which $P_{X \cup Y}(x,y) \neq 0$. But I don't know whether that is really useful...



BEST ANSWER

This is false. Intuitively, it's because "$Y$ fully determines $X$" does not imply "$X$ fully determines $Y$."

Simplest counter-example I could come up with: $X$ is constant equal to $1$, while $Y$ is uniform on $\{0,1\}$.

Then $X$ and $Y$ are independent, so $$ H(X\mid Y) = H(X) = 0 $$ while $$ H(Y\mid X) = H(Y) = \ln 2\,. $$
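A quick numerical check of this counterexample, using the same formula as in the question (a sketch; the helper name is my own):

```python
import math

def cond_entropy(joint, given_second):
    """Conditional entropy in nats from a dict {(x, y): p}.
    given_second=True computes H(X|Y); False computes H(Y|X)."""
    # Marginal of the conditioning variable
    marg = {}
    for (x, y), p in joint.items():
        k = y if given_second else x
        marg[k] = marg.get(k, 0.0) + p
    return sum(p * math.log(marg[y if given_second else x] / p)
               for (x, y), p in joint.items() if p > 0)

# X constant equal to 1, Y uniform on {0, 1}: joint P(1, y) = 1/2
joint = {(1, 0): 0.5, (1, 1): 0.5}
print(cond_entropy(joint, given_second=True))   # H(X|Y) = 0.0
print(cond_entropy(joint, given_second=False))  # H(Y|X) = ln 2 ≈ 0.6931
```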

ANOTHER ANSWER

The result is false. For instance, if $Y$ is uniform on $\{1,2,3,4,5,6\}$ and $X$ is the residue of $Y$ mod $2$, then of course $Y$ determines $X$, so $H(X \mid Y)=0$. On the other hand, both $H(Y \mid X=0)$ and $H(Y \mid X=1)$ equal $\log_2(3)$, and $H(Y\mid X)$ is defined to be the average of these, so $H(Y\mid X)=\log_2(3)\neq 0$.
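This example can also be verified directly, this time in bits to match the $\log_2$ above (a sketch; the function name is my own):

```python
import math

# Y uniform on {1, ..., 6}, X = Y mod 2; joint P(x, y) = 1/6 when x == y % 2
joint = {(y % 2, y): 1 / 6 for y in range(1, 7)}

def cond_entropy_bits(joint, given_second):
    """Conditional entropy in bits from a dict {(a, b): p}.
    given_second=True computes H(first | second); False the reverse."""
    marg = {}
    for (a, b), p in joint.items():
        k = b if given_second else a
        marg[k] = marg.get(k, 0.0) + p
    return sum(p * math.log2(marg[b if given_second else a] / p)
               for (a, b), p in joint.items() if p > 0)

print(cond_entropy_bits(joint, given_second=True))   # H(X|Y) = 0.0
print(cond_entropy_bits(joint, given_second=False))  # H(Y|X) = log2(3) ≈ 1.585
```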