Entropy of a sum


Imagine we have two random variables $X$ and $Y$, and a third random variable $Z = X + Y$.

Suppose the entropy of $Z$ is given by $H(Z) = H(X) + H(Y) + H(Z \vert X)$.

In which cases is the entropy of one of the variables ($H(X)$ or $H(Y)$) higher than the entropy of the sum, $H(Z)$?


The relation $H(Z)=H(X)+H(Y)+H(Z|X)$ is wrong. A quick way to check is to let $X$ be a constant: then $H(X)=0$, $H(Z)=H(Y)$, and $H(Z|X)=H(Y)$, so the relation gives $H(Y)=2H(Y)$! (I imagine you are considering the discrete case.)
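A quick numeric sanity check of this counterexample (a Python sketch; the helper `h` and the Bernoulli parameter $0.3$ are my own illustration, not from the question):

```python
import numpy as np

def h(probs):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Let X = 0 with probability 1 (a constant) and Y ~ Bern(0.3); then Z = X + Y = Y.
hy = h([0.3, 0.7])
hx = 0.0          # entropy of a constant is 0
hz = hy           # Z = Y, so H(Z) = H(Y)
hz_given_x = hy   # X carries no information, so H(Z|X) = H(Y)

# The claimed identity would force H(Y) = 0 + H(Y) + H(Y) = 2 H(Y):
assert abs(hz - (hx + hy + hz_given_x)) > 0.1
```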

The general answer is involved: depending on the support sets of the random variables, one needs to write down the entropies and derive a set of inequalities parametrized by the probability distributions of these random variables.

For example, consider the case where $X\sim \mathrm{Bern}(p_1)$ and $Y\sim \mathrm{Bern}(p_2)$ are independent. Then $Z$ takes values $0$, $1$, $2$, and the desired condition reads $$h\big((1-p_1)(1-p_2),\; p_1(1-p_2)+(1-p_1)p_2,\; p_1 p_2\big) \leq \max\big(h(p_1,1-p_1),\,h(p_2,1-p_2)\big).$$ One needs to expand these terms to derive the desired conditions.
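To probe this concretely, here is a small Python sketch (the helper names `h` and `entropies_indep_bern` are mine) that scans $(p_1, p_2)$ and checks $H(Z)$ against $\max(H(X), H(Y))$ for independent Bernoulli variables:

```python
import numpy as np

def h(probs):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropies_indep_bern(p1, p2):
    """Entropies of X ~ Bern(p1), Y ~ Bern(p2) independent, and Z = X + Y."""
    hx = h([p1, 1 - p1])
    hy = h([p2, 1 - p2])
    # Z takes values 0, 1, 2 with the probabilities below.
    hz = h([(1 - p1) * (1 - p2), p1 * (1 - p2) + (1 - p1) * p2, p1 * p2])
    return hx, hy, hz

# Scan the parameter space: under independence, the condition
# H(Z) < max(H(X), H(Y)) is never satisfied.
for p1 in np.linspace(0.01, 0.99, 50):
    for p2 in np.linspace(0.01, 0.99, 50):
        hx, hy, hz = entropies_indep_bern(p1, p2)
        assert hz >= max(hx, hy) - 1e-12
```

The scan agrees with the intuition given next: independence is exactly the regime where the sum can only be at least as uncertain as each summand.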

However, to give intuition, notice that if $X$ and $Y$ are independent, then $$H(Z) \geq H(Z|Y) = H(X),$$ $$H(Z) \geq H(Z|X) = H(Y).$$ Hence, in the independent case the desired relation cannot hold. It holds only when a certain correlation exists between $X$ and $Y$, e.g., when $Y=-X$: then $Z=0$ is constant, so $H(Z)=0$ while $H(X)$ can be strictly positive.
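The dependent case $Y=-X$ can be sketched in the same way (illustrative Python, with $X$ taken uniform on $\{0,1\}$ as an assumption):

```python
import numpy as np

def h(probs):
    """Shannon entropy in bits of a probability vector (zero entries ignored)."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# X uniform on {0, 1}, Y = -X, so Z = X + Y = 0 with probability 1.
hx = h([0.5, 0.5])   # H(X) = 1 bit
hz = h([1.0])        # Z is constant, so H(Z) = 0
assert hz < hx       # the entropy of a summand exceeds the entropy of the sum
```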