In a cryptography course I have been asked to prove the following: $H(X,Y) = H(Y) + H(X\mid Y)$.
But I simply do not know where to start, so a hint in the right direction would be very much appreciated. I have tried writing out the sums, but that did not get me anywhere; I might be missing something.
Starting from the definition of differential entropy (assuming the random variables are continuous; the discrete case is analogous, with sums in place of integrals), we have $$H(Y)=-\int_{\mathbb{Y}} p(y)\log p(y)\ dy$$ and the following expression for the conditional entropy: $$\begin{align} H(X|Y)&=-\int_{\mathbb{Y}}\int_{\mathbb{X}} p(x,y)\log p(x|y)\ dx\ dy\\&=\int_{\mathbb{Y}}\int_{\mathbb{X}} p(x,y)[\log p(y)-\log p(x,y)]\ dx\ dy\\&=\int_{\mathbb{Y}}\left[\int_{\mathbb{X}} p(x,y)\ dx\right]\log p(y)\ dy\color{blue}{-\int_{\mathbb{Y}}\int_{\mathbb{X}}p(x,y)\log p(x,y)\ dx\ dy}\\&=\int_{\mathbb{Y}}p(y)\log p(y)\ dy\color{blue}{+H(X,Y)} \end{align}$$ where the second line uses $p(x|y)=p(x,y)/p(y)$ and the last line uses $\int_{\mathbb{X}} p(x,y)\ dx=p(y)$. I reckon you can take it from here.
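If it helps build intuition, here is a quick numerical sanity check of the identity in the discrete case (it is the chain rule for entropy), using a small joint pmf I made up for illustration:

```python
import math

# A hypothetical joint pmf p(x, y) on {0, 1} x {0, 1}
p_xy = {
    (0, 0): 0.25, (0, 1): 0.25,
    (1, 0): 0.40, (1, 1): 0.10,
}

# Marginal p(y) = sum over x of p(x, y)
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# Joint entropy H(X, Y) = -sum p(x, y) log2 p(x, y)
H_XY = -sum(p * math.log2(p) for p in p_xy.values())

# Marginal entropy H(Y) = -sum p(y) log2 p(y)
H_Y = -sum(p * math.log2(p) for p in p_y.values())

# Conditional entropy H(X|Y) = -sum p(x, y) log2 p(x|y),
# where p(x|y) = p(x, y) / p(y)
H_X_given_Y = -sum(p * math.log2(p / p_y[y])
                   for (x, y), p in p_xy.items())

# Chain rule: H(X, Y) = H(Y) + H(X|Y), up to floating-point rounding
print(abs(H_XY - (H_Y + H_X_given_Y)) < 1e-12)
```

Of course this is no substitute for the proof, but it is an easy way to catch a sign error in the derivation.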