I have $X$ and $Y$, independent Bernoulli random variables with parameters $p$ and $q$, and $Z = X \oplus Y$ (note that $Z$ is also a Bernoulli variable). The entropy values are:
$$H(Z)=-\left[(1-p)(1-q)+pq\right]\log\left[(1-p)(1-q)+pq\right]$$ $$\qquad-\left[p(1-q)+(1-p)q\right]\log\left[p(1-q)+(1-p)q\right]$$
$$H(X)=-p\log p-(1-p)\log(1-p)$$
$$H(Y)=-q\log q-(1-q)\log(1-q)$$
I need to answer the question: is $H(Z)$ smaller or larger than $\max\{H(X),H(Y)\}$? Does anyone have advice on how to approach the proof? I think $H(Z)$ is larger, but I don't know how to prove it.
Thanks in advance :)
As you point out, the mod-2 sum $Z$ is also Bernoulli. Its entropy is completely determined by the parameter $p_z=\Pr(Z=0)=pq+(1-p)(1-q)$, and the binary entropy function is monotone increasing on $(0,1/2]$ and monotone decreasing on $[1/2,1)$. There is also the symmetry property of entropy, $H(t)=H(1-t)$, which you can appeal to.
To prove your claim, by the monotonicity and symmetry above it suffices to compare $\max\{p_z,1-p_z\}$ with $\min\{\max\{p,1-p\},\max\{q,1-q\}\}$, i.e. to compare $|p_z-1/2|$ with $\min\{|p-1/2|,|q-1/2|\}$; alternatively, show that no such inequality holds in general, or prove the opposite inequality.
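As a quick numerical sanity check (not a proof), you can sweep $p$ and $q$ over a grid and compare $H(Z)$ against $\max\{H(X),H(Y)\}$. A minimal sketch in Python, where `h`, `entropy_of_xor`, and the grid spacing are my own illustrative choices:

```python
import math

def h(t):
    """Binary entropy in bits; h(0) = h(1) = 0 by convention."""
    if t in (0.0, 1.0):
        return 0.0
    return -t * math.log2(t) - (1 - t) * math.log2(1 - t)

def entropy_of_xor(p, q):
    """Entropy of Z = X xor Y with X ~ Bern(p), Y ~ Bern(q) independent."""
    pz = p * q + (1 - p) * (1 - q)  # Pr(Z = 0), i.e. Pr(X = Y)
    return h(pz)

# Sweep a grid and count cases where H(Z) < max(H(X), H(Y)),
# allowing a small tolerance for floating-point roundoff.
violations = 0
grid = [i / 100 for i in range(101)]
for p in grid:
    for q in grid:
        if entropy_of_xor(p, q) < max(h(p), h(q)) - 1e-12:
            violations += 1
print("violations:", violations)  # prints 0 on this grid
```

This agrees with the hint: since $p_z - 1/2 = (1-2p)(1-2q)/2$, the parameter of $Z$ is always at least as close to $1/2$ as each of $p$ and $q$, so no grid point violates the inequality.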