Suppose I have random variables $X, Y$ that are independent and identically distributed. If we denote the entropy by $H$, then does the following equality hold?
$$H(X+Y|Y) = H(X).$$
My intuition says yes since the only randomness in $X+Y$ that remains when $Y$ is observed is $X$. However, I struggle to prove this formally.
Background
We can use the relationship between $H(A|B)$ and $H(A|B=b)$. The latter is the entropy of $A$ conditioned on $B$ taking a particular value $b$
$$ H(A|B=b) = -\sum_a p(A=a|B=b) \log p(A=a|B=b)\tag1 $$
and the former is $H(A|B=b)$ averaged over all possible values of $B$
$$ H(A|B) = \sum_b p(B=b) H(A|B=b),\tag2 $$
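As a concrete sanity check of definitions $(1)$ and $(2)$, here is a small sketch with a made-up joint distribution of $(A, B)$; the numbers are purely illustrative:

```python
from math import log2

# Hypothetical joint distribution p(A=a, B=b) over binary alphabets.
joint = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal p(B=b).
p_b = {}
for (a, b), q in joint.items():
    p_b[b] = p_b.get(b, 0.0) + q

def h_a_given_b(b):
    """H(A|B=b) per equation (1): entropy of the conditional distribution."""
    cond = {a: q / p_b[b] for (a, bb), q in joint.items() if bb == b}
    return -sum(q * log2(q) for q in cond.values() if q > 0)

# H(A|B) per equation (2): H(A|B=b) averaged over b with weights p(B=b).
h_a_given_B = sum(p_b[b] * h_a_given_b(b) for b in p_b)
print(h_a_given_B)
```

For a cross-check, this value agrees with the other common definition mentioned above, $H(A|B) = H(A,B) - H(B)$.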
see e.g. here for the equivalence of $(2)$ to the other common definition of $H(A|B)$.
Proof
Because $X$ and $Y$ are independent, the probability distribution of the random variable $X + y$ is the same as the probability distribution of $X+Y$ conditioned on $Y=y$, i.e.
$$ p(X+y=z) = p(X+Y=z|Y=y) $$
for any fixed $z$; independence is exactly what lets us drop the conditioning on $Y=y$ from the distribution of $X$. Therefore, $H(X+Y|Y=y) = H(X+y)$. Moreover, $H(X)$ depends only on the probabilities that $X$ assigns to its possible values, not on the values themselves, and $x \mapsto x+y$ is a bijection, so shifting by a constant merely relabels the outcomes. Therefore, $H(X + y) = H(X)$ for any fixed $y$.
Using these two observations and equation $(2)$, we have
$$ \begin{align} H(X+Y|Y) &= \sum_y p(Y=y) H(X+Y|Y=y) \\ &= \sum_y p(Y=y) H(X+y) \\ &= \sum_y p(Y=y) H(X) \\ &= H(X) \sum_y p(Y=y) \\ &= H(X). \end{align} $$
Note that the proof only uses the independence of $X$ and $Y$; the assumption that they are identically distributed is not needed.
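The chain of equalities can also be verified numerically. A minimal sketch, assuming $X$ and $Y$ are i.i.d. with a made-up distribution over $\{0, 1, 2\}$:

```python
from math import log2

# Hypothetical common distribution of X and Y (i.i.d.), value -> probability.
p = {0: 0.5, 1: 0.3, 2: 0.2}

def entropy(dist):
    """Shannon entropy of a dict value -> probability."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# H(X)
h_x = entropy(p)

# H(X+Y|Y) via equation (2): sum over y of p(Y=y) * H(X+Y|Y=y),
# where by independence the conditional distribution of X+Y given
# Y=y is the distribution of X shifted by y.
h_sum_given_y = 0.0
for y, py in p.items():
    shifted = {x + y: px for x, px in p.items()}  # distribution of X + y
    h_sum_given_y += py * entropy(shifted)

print(h_x, h_sum_given_y)  # the two values agree
```

Each term $H(X+y)$ has exactly the same multiset of probabilities as $H(X)$, so the weighted average collapses to $H(X)$, matching the derivation above.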