I would like to prove rigorously that $H(X|Y)=0 \Rightarrow H(Y|X)=0$, where $H$ is the Shannon entropy, but I'm struggling.
First: I think it is correct because, if $H(X|Y)=0$, it physically means that my variables $X$ and $Y$ are fully correlated. But I'm stuck on the details of the proof I tried to make.
From the definitions, I have:
$$ H(X|Y)=\sum_{x,y} P_{X \cup Y} (x,y) \ln\left(\frac{P_Y(y)}{P_{X \cup Y}(x,y)}\right)$$
Since $P_{X \cup Y}(x,y) \le P_Y(y)$, every term in this sum is non-negative, so $H(X|Y)=0$ means that each term vanishes:
$$\forall (x,y) : \quad P_{X \cup Y}(x,y)=0 \quad \text{or} \quad P_Y(y)=P_{X \cup Y}(x,y)$$
To prove that $H(Y|X)=0$, I would thus need:
$$\forall (x,y) : \quad P_{X \cup Y}(x,y)=0 \quad \text{or} \quad P_X(x)=P_{X \cup Y}(x,y)$$
But I can't find a way to arrive at such a result...
The one thing I can see is this: since I must have $P_Y(y)=P_{X \cup Y}(x,y)$ whenever $P_{X \cup Y} (x,y) \neq 0$, it means that for fixed $y$ there is only one $x$ for which $P_{X \cup Y} (x,y) \neq 0$. But I don't know whether this is really useful...
This is false. Intuitively, it's because "$Y$ fully determines $X$" does not imply "$X$ fully determines $Y$."
The simplest counterexample I could come up with: $X$ is constant, equal to $1$, while $Y$ is uniform on $\{0,1\}$.
Then $X$ and $Y$ are independent, so $$ H(X\mid Y) = H(X) = 0 $$ while $$ H(Y\mid X) = H(Y) = \ln 2\,. $$
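You can also check this counterexample numerically. Below is a small sketch (the dictionary-based joint distribution and the helper names `marginal` / `cond_entropy` are my own choices, not standard library functions) that evaluates both conditional entropies directly from the definition above:

```python
import math

# Counterexample: X is constant equal to 1, Y is uniform on {0, 1}.
# The joint distribution is stored as {(x, y): probability}.
joint = {(1, 0): 0.5, (1, 1): 0.5}

def marginal(joint, axis):
    """Marginal distribution of the variable at position `axis` (0 = X, 1 = Y)."""
    m = {}
    for xy, p in joint.items():
        m[xy[axis]] = m.get(xy[axis], 0.0) + p
    return m

def cond_entropy(joint, given_axis):
    """H(A | B) = sum_{a,b} P(a,b) * ln(P_B(b) / P(a,b)),
    where B is the variable at position `given_axis`."""
    pb = marginal(joint, given_axis)
    return sum(p * math.log(pb[xy[given_axis]] / p)
               for xy, p in joint.items() if p > 0)

h_x_given_y = cond_entropy(joint, given_axis=1)  # H(X|Y) = 0
h_y_given_x = cond_entropy(joint, given_axis=0)  # H(Y|X) = ln 2

print(h_x_given_y)
print(h_y_given_x)
```

Running it gives $H(X \mid Y) = 0$ while $H(Y \mid X) = \ln 2 \approx 0.693$, confirming that the implication fails.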