Entropy of XOR between two Bernoulli variables


I have $X$ and $Y$, independent Bernoulli random variables with parameters $p$ and $q$, and $Z = X \oplus Y$ (note that $Z$ is also a Bernoulli variable). The entropies are:

$$H(Z)=-\left[(1-p)(1-q)+pq\right]\log\left[(1-p)(1-q)+pq\right]-\left[p(1-q)+(1-p)q\right]\log\left[p(1-q)+(1-p)q\right]$$

$$H(X)=−p\log p−(1−p)\log(1−p)$$

$$H(Y)=−q\log q−(1−q)\log(1−q)$$

I need to answer the question:

Is $H(Z)$ smaller or larger than $\max\{H(X),H(Y)\}$? Does anyone have advice on how to prove it? I think $H(Z)$ is larger, but I don't know how to prove it.

Thanks in advance :)

2 Answers


As you point out, $Z$ is also Bernoulli. Its entropy is completely determined by the parameter $p_z=pq+(1-p)(1-q)$ (here taken as $P(Z=0)$), and binary entropy is monotone increasing on $(0,1/2]$ and monotone decreasing on $[1/2,1)$. There is also the symmetry property of entropy, $H(p_z)=H(1-p_z)$, that you can appeal to.

To prove your claim you need to show that $\max\{p_z,1-p_z\}$ is no larger than both $\max\{p,1-p\}$ and $\max\{q,1-q\}$ (equivalently, that $p_z$ is at least as close to $1/2$ as $p$ and $q$ are), or alternatively show that no such inequality holds in general, or that the opposite one does.
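One way to carry out that comparison (a sketch using the notation above) is via the identity

$$p_z - \tfrac{1}{2} = pq + (1-p)(1-q) - \tfrac{1}{2} = \tfrac{1}{2}(2p-1)(2q-1),$$

so that

$$\left|p_z - \tfrac{1}{2}\right| = \tfrac{1}{2}\,|2p-1|\,|2q-1| \le \tfrac{1}{2}\,|2p-1| = \left|p - \tfrac{1}{2}\right|,$$

since $|2q-1| \le 1$. By symmetry the same bound holds with $q$ in place of $p$, so $p_z$ is at least as close to $1/2$ as both $p$ and $q$, and the monotonicity above gives $H(Z) \ge \max\{H(X),H(Y)\}$.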


Since $X \oplus Y \oplus X = Y$ (so, conditioned on $X$, the variables $Z$ and $Y$ determine each other), $$ H(Z) \ge H(Z|X) = H(X \oplus Y|X) = H(Y|X) = H(Y),$$ where the last equality uses the independence of $X$ and $Y$. Exchanging the roles of $X$ and $Y$, we also get $H(Z) \ge H(X)$, ergo $H(Z) \ge \max(H(X), H(Y))$.
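As a quick numerical sanity check (a sketch in Python; the function names `h` and `xor_param` are mine, not from the question), one can sample random $(p, q)$ pairs and verify the inequality:

```python
import random
from math import log2

def h(p):
    """Binary entropy in bits, with the convention H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def xor_param(p, q):
    """P(Z = 1) for Z = X xor Y, with X ~ Bern(p), Y ~ Bern(q) independent."""
    return p * (1 - q) + (1 - p) * q

random.seed(0)
for _ in range(10_000):
    p, q = random.random(), random.random()
    # Small tolerance guards against floating-point rounding only.
    assert h(xor_param(p, q)) >= max(h(p), h(q)) - 1e-12
print("H(Z) >= max(H(X), H(Y)) held in all trials")
```

This does not prove anything, of course, but it is a cheap way to rule out the opposite inequality before attempting the proof.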