Let $X_1, X_2$ be i.i.d. $\text{Binom}(1, \frac{1}{2})$. Let $Y=\max(X_1, X_2)$. How can we obtain the following quantities?
1) $H(Y)$
2) $I(X_1; Y)$
3) $I(X_1, X_2; Y)$
$H$ is entropy, $I$ is mutual information.
If $X_1$, $X_2$ are both Bernoulli with parameter $p$, we have: $$ P(Y=0)=P(X_1=0)P(X_2=0)=(1-p)^2\implies P(Y=1)=1-(1-p)^2. $$ The entropy of $Y$ then follows directly: $$ H(Y)=-(1-p)^2\log (1-p)^2-\bigl(1-(1-p)^2\bigr)\log\bigl(1-(1-p)^2\bigr). $$
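As a numerical sanity check, here is a small Python sketch of this computation (the helper names `binary_entropy` and `entropy_of_max` are my own, not from the question):

```python
import math

def binary_entropy(q):
    """Entropy (in bits) of a Bernoulli(q) random variable."""
    if q in (0.0, 1.0):
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

def entropy_of_max(p):
    """H(Y) where Y = max(X1, X2) and X1, X2 are i.i.d. Bernoulli(p)."""
    q = 1 - (1 - p) ** 2  # P(Y = 1)
    return binary_entropy(q)

print(entropy_of_max(0.5))  # ≈ 0.8113 bits
```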
This corresponds to a channel with the following relation: $$ Y=X_1+X_2-X_1X_2. $$ In any case, $Y$ is a function of $(X_1,X_2)$, so $H(Y|X_1,X_2)=0$. Hence: $$ I(X_1,X_2;Y)=H(Y)-H(Y|X_1,X_2)=H(Y). $$
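The claimed channel relation can be verified exhaustively on the four binary input pairs, confirming that $Y$ is a deterministic function of $(X_1,X_2)$:

```python
from itertools import product

# Check that max(x1, x2) = x1 + x2 - x1*x2 on {0,1}^2, so Y is a
# deterministic function of (X1, X2) and H(Y | X1, X2) = 0.
for x1, x2 in product([0, 1], repeat=2):
    assert max(x1, x2) == x1 + x2 - x1 * x2
print("Y = X1 + X2 - X1*X2 holds on all four input pairs")
```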
Finally, if $X_1=1$ then $Y=1$, so: $$ H(Y|X_1=1)=0, $$ and if $X_1=0$, then $Y=X_2$, so: $$ H(Y|X_1=0)=H(X_2). $$ In other words: $$ H(Y|X_1)=P(X_1=1)\cdot 0+P(X_1=0)H(X_2)=(1-p)H(X_2), $$ and therefore $I(X_1;Y)=H(Y)-H(Y|X_1)=H(Y)-(1-p)H(X_2)$.
If $p=\frac 12$, then $H(Y)=\frac 14\log 4+\frac 34\log \frac 43\approx 0.811$ and $H(X_2)=1$, so $I(X_1,X_2;Y)=H(Y)\approx 0.811$ and $I(X_1;Y)=H(Y)-\frac 12\approx 0.311$. (The base of the logarithm is 2, so all quantities are in bits.)
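All three quantities can also be computed by brute force from the joint distribution of $(X_1,X_2,Y)$, using the identity $I(A;B)=H(A)+H(B)-H(A,B)$. A sketch (the helpers `entropy` and `marginal` are my own names):

```python
import math
from itertools import product

p = 0.5
# Joint distribution over (x1, x2, y) with y = max(x1, x2).
joint = {}
for x1, x2 in product([0, 1], repeat=2):
    prob = (p if x1 else 1 - p) * (p if x2 else 1 - p)
    joint[(x1, x2, max(x1, x2))] = prob

def entropy(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

def marginal(dist, idxs):
    """Marginalize a joint distribution onto the given coordinate indices."""
    out = {}
    for key, q in dist.items():
        k = tuple(key[i] for i in idxs)
        out[k] = out.get(k, 0.0) + q
    return out

H_Y = entropy(marginal(joint, [2]))
# I(X1; Y) = H(X1) + H(Y) - H(X1, Y)
I_X1_Y = entropy(marginal(joint, [0])) + H_Y - entropy(marginal(joint, [0, 2]))
# I(X1, X2; Y) = H(X1, X2) + H(Y) - H(X1, X2, Y)
I_X12_Y = entropy(marginal(joint, [0, 1])) + H_Y - entropy(joint)

print(f"H(Y)        = {H_Y:.4f}")      # ≈ 0.8113
print(f"I(X1; Y)    = {I_X1_Y:.4f}")   # ≈ 0.3113
print(f"I(X1,X2; Y) = {I_X12_Y:.4f}")  # ≈ 0.8113
```

This matches the closed-form answers: $I(X_1,X_2;Y)=H(Y)$ and $I(X_1;Y)=H(Y)-\frac 12$.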