Find entropy and mutual information


Let $X_1, X_2$ be i.i.d. $\text{Binom}(1, \frac{1}{2})$, i.e., Bernoulli with parameter $\frac{1}{2}$, and let $Y=\max(X_1, X_2)$. How can we obtain the following quantities?

1) $H(Y)$

2) $I(X_1; Y)$

3) $I(X_1, X_2; Y)$

$H$ is entropy, $I$ is mutual information.

Accepted answer:

If $X_1$, $X_2$ are both Bernoulli with parameter $p$, we have: $$ P(Y=0)=P(X_1=0)P(X_2=0)=(1-p)^2\implies P(Y=1)=1-(1-p)^2=2p-p^2. $$ The entropy of $Y$ follows directly:

$$ H(Y)=-(1-p)^2\log(1-p)^2-(2p-p^2)\log(2p-p^2). $$
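As a quick numerical sketch, the formula above can be evaluated for any $p$ (function name is mine, not from the original):

```python
import math

def entropy_of_max(p):
    """Entropy in bits of Y = max(X1, X2), with X1, X2 i.i.d. Bernoulli(p)."""
    q0 = (1 - p) ** 2      # P(Y = 0) = (1 - p)^2
    q1 = 1 - q0            # P(Y = 1) = 2p - p^2
    # Sum -q log2(q) over outcomes with positive probability.
    return -sum(q * math.log2(q) for q in (q0, q1) if q > 0)

print(entropy_of_max(0.5))  # ≈ 0.8113 bits
```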

This corresponds to a channel with the following relation: $$ Y=X_1+X_2-X_1X_2. $$ In any case, $Y$ is a deterministic function of $X_1,X_2$, so $H(Y|X_1,X_2)=0$. Hence:

$$ I(X_1,X_2;Y)=H(Y). $$

Finally, if $X_1=1$ then $Y=1$, so $$ H(Y|X_1=1)=0, $$ and if $X_1=0$ then $Y=X_2$, so $$ H(Y|X_1=0)=H(X_2). $$ Averaging over $X_1$ gives $H(Y|X_1)=P(X_1=0)H(X_2)$, and therefore:

$$ I(X_1;Y)=H(Y)-P(X_1=0)H(X_2). $$

If $p=\frac 12$, then $H(Y)=\frac 14\log 4+\frac 34\log \frac 43\approx 0.811$ and $H(X_2)=1$, so $I(X_1,X_2;Y)=H(Y)\approx 0.811$ and $I(X_1;Y)=H(Y)-\frac 12\approx 0.311$. (Base of logarithm is 2, so all quantities are in bits.)
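The whole argument can be checked by brute-force enumeration of the joint distribution, computing $I(X_1;Y)=H(X_1)+H(Y)-H(X_1,Y)$ directly rather than via the conditional-entropy shortcut (variable names here are illustrative):

```python
import math
from itertools import product

p = 0.5

def H(dist):
    """Entropy in bits of a distribution given as a dict of probabilities."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

# Joint distribution of (X1, Y) and marginal of Y, with Y = max(X1, X2).
pY, pX1Y = {}, {}
for x1, x2 in product((0, 1), repeat=2):
    pr = (p if x1 else 1 - p) * (p if x2 else 1 - p)
    y = max(x1, x2)
    pY[y] = pY.get(y, 0.0) + pr
    pX1Y[(x1, y)] = pX1Y.get((x1, y), 0.0) + pr

pX1 = {0: 1 - p, 1: p}

I_X1_Y = H(pX1) + H(pY) - H(pX1Y)   # I(X1; Y)
I_X1X2_Y = H(pY)                     # I(X1, X2; Y) = H(Y), since H(Y|X1,X2) = 0

print(I_X1_Y)    # ≈ 0.3113 bits
print(I_X1X2_Y)  # ≈ 0.8113 bits
```

Both values agree with the closed forms above: $I(X_1;Y)=H(Y)-\frac12 H(X_2)$ and $I(X_1,X_2;Y)=H(Y)$.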