Let $X$ and $Y$ be two independent discrete random variables. Is it possible to simplify the joint entropy $H(X-Y,X+Y)$?
From my understanding, if $X$ and $Y$ are independent, then $H(X+Y) = H(X,Y)$. However, I'm unsure how to proceed since $X-Y$ and $X+Y$ aren't independent. Ideally I want to get $H(X-Y,X+Y)$ in terms of $H(X)$ and $H(Y)$.
As Stelios comments, $H(X+Y) = H(X,Y)$ is false. You probably meant $H(X,Y)=H(X)+H(Y)$, which is quite a different thing.
Now, setting $U=X+Y$ and $V=X-Y$, we have $X=\frac12 (U+V)$ and $Y=\frac12(U-V)$.
Hence the pairs $(X,Y)$ and $(U,V)$ are related by a one-to-one (invertible) map, so they carry the same information and their joint entropies coincide.
That is
$$H(X+Y,X-Y)=H(U,V)=H(X,Y) = H(X)+H(Y)$$
The last equality holds precisely because $X$ and $Y$ are independent; the first two hold in general.
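As a sanity check, the identity $H(X+Y,X-Y)=H(X)+H(Y)$ can be verified numerically. The sketch below uses two hypothetical distributions of my own choosing (a uniform $X$ on $\{0,1,2\}$ and a biased coin $Y$), builds the joint law of $(X+Y,X-Y)$ under independence, and compares the two entropies:

```python
from math import log2
from itertools import product
from collections import Counter

def entropy(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Hypothetical example distributions (any discrete pmfs would do):
pX = {0: 1/3, 1: 1/3, 2: 1/3}   # X uniform on {0, 1, 2}
pY = {0: 0.25, 1: 0.75}         # Y a biased coin

# Joint pmf of (X+Y, X-Y), using independence: P(x, y) = P(x) P(y).
joint = Counter()
for (x, px), (y, py) in product(pX.items(), pY.items()):
    joint[(x + y, x - y)] += px * py

lhs = entropy(joint)             # H(X+Y, X-Y)
rhs = entropy(pX) + entropy(pY)  # H(X) + H(Y)
print(abs(lhs - rhs) < 1e-9)     # True
```

Note that the map $(x,y)\mapsto(x+y,x-y)$ sends each of the six input pairs to a distinct output pair, which is exactly the one-to-one property the argument relies on; the `Counter` never merges two atoms, so the two entropies agree up to floating-point error.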