Entropy of sum of two (potentially) dependent R.Vs


I searched all over the internet and couldn't find any general upper bound on $H(X+Y)$, where $X$ and $Y$ are two random variables that are not necessarily independent. Is there such an upper bound? If not, what are the alternatives?



Using the identity $H(Z)=H(X)+H(Z \mid X) - H(X\mid Z)$ (a rearrangement of the mutual information $I(X;Z)$) with $Z=X+Y$, and noting that $H(X+Y \mid X)=H(Y \mid X)$, we get:

$$\begin{align} H(X+Y)&=H(X)+H(Y\mid X)- H(X \mid X+Y)\\ &=H(Y)+H(X\mid Y)- H(Y \mid X+Y)\\ &=H(X,Y)- H(X \mid X+Y)\\ \end{align}$$

Now, $H(X,Y) \le H(X)+H(Y)$, with equality iff $X$ and $Y$ are independent, and $H(X \mid X+Y)\ge 0$, with equality iff the sum determines the summands; i.e., iff $x_1+y_1 \ne x_2+y_2$ for any two distinct pairs $(x_1,y_1)\ne(x_2,y_2)$ with positive probability.

Hence $$H(X+Y)\le H(X)+H(Y)$$ is a valid general bound. If the variables are dependent and $H(X,Y)$ is known, then

$$H(X+Y)\le H(X,Y)$$ is a tighter bound.
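As a quick sanity check, here is a small numerical sketch of the derivation and both bounds. The joint pmf `p` is a made-up dependent example (not from the question); entropies are computed directly from the pmf in bits:

```python
from math import log2

# Hypothetical joint pmf p(x, y); X and Y are dependent in this example.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

def pmf_of(f):
    """pmf of the random variable f(X, Y) under the joint pmf p."""
    out = {}
    for (x, y), q in p.items():
        out[f(x, y)] = out.get(f(x, y), 0.0) + q
    return out

HX = H(pmf_of(lambda x, y: x))    # H(X)
HY = H(pmf_of(lambda x, y: y))    # H(Y)
HXY = H(p)                        # H(X, Y)
HS = H(pmf_of(lambda x, y: x + y))                    # H(X + Y)
HX_given_S = H(pmf_of(lambda x, y: (x, x + y))) - HS  # H(X | X + Y)

# The derivation above: H(X + Y) = H(X, Y) - H(X | X + Y)
assert abs(HS - (HXY - HX_given_S)) < 1e-12
# The two bounds: H(X + Y) <= H(X, Y) <= H(X) + H(Y)
assert HS <= HXY <= HX + HY
```

For this dependent pair both inequalities are strict: $H(X+Y)\approx 1.522$, $H(X,Y)\approx 1.722$, and $H(X)+H(Y)=2$, so the joint-entropy bound is indeed the tighter one.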