Simplifying Entropy $H(X-Y,X+Y)$


Let $X$ and $Y$ be two independent discrete random variables. Is it possible to simplify the joint entropy $H(X-Y,X+Y)$?

From my understanding, if $X$ and $Y$ are independent, then $H(X+Y) = H(X,Y)$. However, I'm unsure how to proceed since $X-Y$ and $X+Y$ aren't independent. Ideally I want to get $H(X-Y,X+Y)$ in terms of $H(X)$ and $H(Y)$.



As Stelios comments, $H(X+Y) = H(X,Y)$ is false. You probably meant $H(X,Y)=H(X)+H(Y)$, which is quite a different thing.

Now, writing $U=X+Y$ and $V=X-Y$, we have $X=\frac12 (U+V)$ and $Y=\frac12(U-V)$.

The pairs $(X,Y)$ and $(U,V)$ are therefore related by a one-to-one map, so they carry the same information and have the same joint entropy.

That is

$$H(X+Y,X-Y)=H(U,V)=H(X,Y) = H(X)+H(Y)$$

The last equality holds only when $X,Y$ are independent.
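The identity is easy to check numerically. Below is a minimal sketch in Python; the two marginal distributions `px` and `py` are made-up examples, not from the original question.

```python
from itertools import product
from math import log2
from collections import Counter

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Hypothetical marginals for independent X and Y.
px = {0: 0.5, 1: 0.3, 2: 0.2}
py = {0: 0.6, 1: 0.4}

# Joint distribution of (U, V) = (X+Y, X-Y); independence gives p(x, y) = p(x)p(y).
joint_uv = Counter()
for (x, p), (y, q) in product(px.items(), py.items()):
    joint_uv[(x + y, x - y)] += p * q

h_uv = entropy(joint_uv)
h_x_plus_h_y = entropy(px) + entropy(py)
assert abs(h_uv - h_x_plus_h_y) < 1e-12  # H(X+Y, X-Y) = H(X) + H(Y)
```

Any other pair of independent discrete marginals would work equally well here, since the argument only uses the invertibility of $(x,y)\mapsto(x+y,x-y)$.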


$$H(X+Y,X-Y)=H(X)+H(Y)$$


Proof: By definition,
$$H(X+Y,X-Y) = -\sum_{x+y}\sum_{x-y} p(x+y,x-y)\log p(x+y,x-y).$$
Since $2x=(x+y)+(x-y)$ and $2y=(x+y)-(x-y)$, the map $(x,y)\mapsto(x+y,x-y)$ is one-to-one, so summing $\sum_x\sum_y$ reproduces every distinct term in the sums, and by independence $p(x+y,x-y)=p(x,y)=p(x)p(y)$. Hence
$$H(X+Y,X-Y) = -\sum_x\sum_y p(x)p(y)\log\bigl(p(x)p(y)\bigr) = H(X)+H(Y).$$

It might be instructive to do the calculation for a simple case, such as $X, Y \in \{0,1\}$ with equal probabilities, and check that the sums come out correctly.
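The simple case suggested above works out as follows: with $X, Y$ uniform and independent on $\{0,1\}$, the four values of $(X+Y, X-Y)$ are distinct, so the joint entropy is $2$ bits, matching $H(X)+H(Y)=1+1$.

```python
from math import log2

# X, Y independent and uniform on {0, 1}: four equally likely (x, y) pairs.
pairs = [(x + y, x - y) for x in (0, 1) for y in (0, 1)]

# The values (0,0), (1,-1), (1,1), (2,0) are all distinct,
# so each outcome of (X+Y, X-Y) has probability 1/4.
assert len(set(pairs)) == 4

h = -sum(0.25 * log2(0.25) for _ in pairs)
assert h == 2.0  # equals H(X) + H(Y) = 1 + 1 bits
```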

Incidentally, if $X$ and $Y$ are not independent, almost the same argument gives
$$H(X+Y,X-Y)=H(X,Y).$$
The only change needed is to write the probabilities as $p(x,y)$ rather than $p(x)p(y)$, which yields the usual formula for $H(X,Y)$.
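The dependent case can also be checked numerically. The joint distribution `pxy` below is a made-up example that does not factor as $p(x)p(y)$; the equality $H(X+Y,X-Y)=H(X,Y)$ still holds because $(x,y)\mapsto(x+y,x-y)$ is one-to-one.

```python
from math import log2
from collections import Counter

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# A hypothetical dependent joint distribution p(x, y); X and Y are correlated.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Push the joint distribution forward through (x, y) -> (x+y, x-y).
puv = Counter()
for (x, y), p in pxy.items():
    puv[(x + y, x - y)] += p

assert abs(entropy(puv) - entropy(pxy)) < 1e-12  # H(X+Y, X-Y) = H(X, Y)
```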