What is the mutual information between the top side and the bottom side of a coin?
Let $T$ be the top side and $B$ the bottom side of a tossed fair coin.
$$I(T;B) = H(B) - H(B|T) = \log(2) = 1$$
(The logarithm is base 2.)
I don't understand why this is the answer. Which is correct: $H(B)=\frac12\log2$ or $H(B)=\log2$? Since $p(B)=p(T)=0.5$, I think $H(B)=\frac12\log2$ and $H(B|T)=0$, but the given answer is $$I(T;B) = H(B) - H(B|T) = \log 2 = 1.$$
(This is from the complete solutions manual for *Elements of Information Theory*, 2nd edition.)
You ought to check the definitions more carefully. These are very elementary calculations.
The probabilities are $$P(T\text{ happens})=P(B\text{ happens})=\frac12,$$ $$P(T\text{ happens}\mid B\text{ happens})=P(B\text{ happens}\mid T\text{ happens})=0.$$ The marginal entropies are $$H(B)=H(T)=-\sum_x p(x)\log p(x)=-\tfrac12\log\tfrac12-\tfrac12\log\tfrac12=\log2=1.$$ Note that the entropy sum runs over *both* outcomes, so there are two terms of $\frac12\log2$, not one; that is where your $H(B)=\frac12\log2$ goes wrong. The conditional entropies are $$H(T|B)=H(B|T)=0,$$ since exactly one of $B$ and $T$ can happen: once you know one of them, the other is determined. Therefore, $$I(T;B) = H(T) - H(T|B) = 1,$$ $$I(B;T) = H(B) - H(B|T) = 1.$$ In fact, it is always true that $I(T;B)=I(B;T)$.
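If it helps, here is a minimal numerical check of the calculation above (a Python sketch; the dictionary encoding of the joint distribution is just an illustrative choice):

```python
import math

# Joint distribution of (top, bottom) for a fair coin:
# if the top shows heads the bottom shows tails, and vice versa.
joint = {("H", "T"): 0.5, ("T", "H"): 0.5}

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions of the top side and the bottom side.
p_top, p_bottom = {}, {}
for (t, b), p in joint.items():
    p_top[t] = p_top.get(t, 0.0) + p
    p_bottom[b] = p_bottom.get(b, 0.0) + p

# Mutual information via the identity I(T;B) = H(T) + H(B) - H(T,B).
mi = entropy(p_top) + entropy(p_bottom) - entropy(joint)

print(entropy(p_bottom))  # H(B) = 1.0 bit: two terms of 0.5*log2(2) each
print(mi)                 # I(T;B) = 1.0 bit
```

This makes the asker's confusion concrete: each outcome contributes $\frac12\log2$ to the sum, and the two contributions together give $H(B)=\log2=1$.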