This question is related to this math.se question but I need a bit more guidance.
For two discrete random variables $X,Y$ we define their conditional entropy to be $$H(X|Y) = -\sum_{y \in Y} Pr[Y = y] ( \sum_{x \in X} Pr[X = x | Y = y] \log_2{Pr[X=x|Y=y]}).$$
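To build intuition, here is a small numerical sanity check of this definition (a sketch I wrote for this question, not part of any standard library): it computes $H(X|Y)$ directly from a joint pmf and confirms it is $0$ when $X$ is a function of $Y$, and positive otherwise.

```python
import math

def cond_entropy(joint):
    """H(X|Y) in bits, given joint[(x, y)] = Pr[X = x, Y = y]."""
    # Marginal Pr[Y = y]
    p_y = {}
    for (x, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    # H(X|Y) = -sum_{x,y} Pr[x,y] * log2 Pr[x|y]
    h = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            h -= p * math.log2(p / p_y[y])
    return h

# X = f(Y) with f(y) = y mod 2: every Pr[x|y] is 1, so H(X|Y) = 0
deterministic = {(0, 0): 0.25, (1, 1): 0.25, (0, 2): 0.25, (1, 3): 0.25}
print(cond_entropy(deterministic))  # 0.0

# X is a fair bit independent of Y: H(X|Y) = 1 bit
noisy = {(0, 0): 0.25, (1, 0): 0.25, (0, 1): 0.25, (1, 1): 0.25}
print(cond_entropy(noisy))  # 1.0
```

The key observation the code makes concrete: each inner term vanishes exactly when $\Pr[X = x \mid Y = y] \in \{0, 1\}$, which is the deterministic case.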
I would like to show that $H(X|Y) = 0$ if and only if $X = f(Y)$ for some function $f$ (i.e., $X$ is a deterministic function of $Y$).
Can someone help me with this? I don't see why $X$ being a function of $Y$ forces $H(X|Y) = 0$, nor why the converse holds.