Conditional Mutual Information - Cover exercise


The following is an exercise from Cover and Thomas's Elements of Information Theory.

[Image of the exercise statement. From the answer below, the setup is: $X_1, \ldots, X_n$ are binary random variables where every sequence with an even number of $1$'s has probability $2^{-(n-1)}$ and every sequence with an odd number of $1$'s has probability $0$; find $I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2})$.]

While I understand that $$I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2}) = H(X_{n-1} \mid X_1, \ldots, X_{n-2}) - H(X_{n} \mid X_1, \ldots, X_{n-2}, X_{n-1}),$$ and hence $$I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2}) = H(X_{n-1}),$$ I don't understand why $H(X_{n-1}) = 1$ bit (that is, how do we know that $X_{n-1} \sim \text{Ber}(1/2)$).


Your first equality is wrong; it should be

$$I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2}) = H(X_{n} \mid X_1, \ldots, X_{n-2}) - H(X_{n} \mid X_1, \ldots, X_{n-2}, X_{n-1})$$

The last term is zero because, given $X_1, \ldots, X_{n-1}$, the value of $X_n$ is completely determined: it is whatever bit makes the total number of $1$'s even.

For the other term:

Given $X_1, \ldots, X_{n-2}$, there are only two alternatives for the last two bits.

If $X_1, \ldots, X_{n-2}$ has an even number of $1$'s, then either $(X_{n-1}, X_n) = (0,0)$ or $(X_{n-1}, X_n) = (1,1)$, with equal probability.

If $X_1, \ldots, X_{n-2}$ has an odd number of $1$'s, then either $(X_{n-1}, X_n) = (1,0)$ or $(X_{n-1}, X_n) = (0,1)$, with equal probability.
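The two cases above can be checked by brute-force enumeration. A minimal Python sketch (the variable names are my own, not from the book): list all equally likely length-$(n-1)$ prefixes, append the parity bit $X_n$, and tabulate which pairs $(X_{n-1}, X_n)$ occur for each parity of $X_1, \ldots, X_{n-2}$.

```python
from itertools import product
from collections import Counter

n = 5  # any small n works; 2^(n-1) sequences are enumerated
pairs_by_parity = {0: Counter(), 1: Counter()}

for bits in product((0, 1), repeat=n - 1):
    prefix, x_last = bits[:-1], bits[-1]  # prefix = (X_1..X_{n-2}), x_last = X_{n-1}
    x_n = sum(bits) % 2                   # X_n makes the total number of 1's even
    pairs_by_parity[sum(prefix) % 2][(x_last, x_n)] += 1

print(pairs_by_parity[0])  # only (0,0) and (1,1), equally often
print(pairs_by_parity[1])  # only (0,1) and (1,0), equally often
```

In both cases exactly two pairs occur, each with the same count, matching the case analysis.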

In either case $X_n$ is $0$ or $1$ with probability $1/2$ each. Hence $H(X_{n} \mid X_1, \ldots, X_{n-2}) = 1$ bit, and therefore $I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2}) = 1 - 0 = 1$ bit.
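As a final sanity check, the conditional mutual information itself can be computed numerically from the joint distribution. A minimal sketch (the helper `cond_mutual_info` is mine, not from the book): assign probability $2^{-(n-1)}$ to each even-parity sequence and evaluate $I(X_{n-1}; X_n \mid X_1, \ldots, X_{n-2}) = \sum_x p(x) \log_2 \frac{p(x_1^{n-2})\, p(x)}{p(x_1^{n-2}, x_{n-1})\, p(x_1^{n-2}, x_n)}$ by exhaustive enumeration.

```python
from itertools import product
from math import log2

def cond_mutual_info(n):
    """I(X_{n-1}; X_n | X_1..X_{n-2}) for the even-parity distribution."""
    # Joint pmf: each even-parity sequence of length n has probability 2^-(n-1).
    p = {}
    for bits in product((0, 1), repeat=n - 1):
        x = bits + (sum(bits) % 2,)  # append the bit that makes parity even
        p[x] = 2 ** -(n - 1)

    def marginal(keep):
        """Marginal pmf over the coordinates listed in `keep`."""
        m = {}
        for x, px in p.items():
            key = tuple(x[i] for i in keep)
            m[key] = m.get(key, 0.0) + px
        return m

    prefix = list(range(n - 2))
    p_pre = marginal(prefix)                 # p(x_1..x_{n-2})
    p_pre_a = marginal(prefix + [n - 2])     # p(x_1..x_{n-2}, x_{n-1})
    p_pre_b = marginal(prefix + [n - 1])     # p(x_1..x_{n-2}, x_n)

    total = 0.0
    for x, px in p.items():
        pre = x[: n - 2]
        num = p_pre[pre] * px
        den = p_pre_a[pre + (x[n - 2],)] * p_pre_b[pre + (x[n - 1],)]
        total += px * log2(num / den)
    return total

print(cond_mutual_info(4))  # should print 1.0
```

For every $n \ge 2$ this returns $1$ bit, in agreement with the derivation above.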