Mutual Information of the given expression


Suppose a discrete random variable $ X $ gives rise to two discrete random variables $(Y_1,Y_2)$ according to the pmf $P_{Y_1,Y_2|X}(y_1,y_2|x)$, and that the conditional marginal pmfs are the same, i.e., $P_{Y_1|X} = P_{Y_2|X}$. Define $Y \sim P_{Y|X}$ (the common conditional marginal pmf of the two random variables just described). The question is to prove $$ I(X;Y_1,Y_2) = I(X;Y).$$

I have reduced it to proving $$H(X|Y_1,Y_2) = H(X|Y),$$ but I can't proceed any further.

On BEST ANSWER

This is not true in general. Suppose $Y_1$ and $Y_2$ are conditionally independent given $X$. Then the claim amounts to asking whether two transmissions over a discrete memoryless channel (DMC) with the same input yield the same information as a single transmission. This is clearly false, for the same reason that collecting many samples from a fixed distribution characterises it better than collecting one.

Concretely, let $X$ be a uniform bit, and let $Y_1, Y_2$ be the outputs of two independent uses of a binary erasure channel with erasure probability $\varepsilon$; that is, each output $Y$ takes values in $\{0,1,?\}$ with the conditional law $$ P(Y=y|X) = (1-\varepsilon)\delta_{y,X} + \varepsilon \delta_{y,?},$$ where $\delta$ is the Kronecker delta. In words, $Y$ takes the same value as $X$ with probability $1-\varepsilon$, and is 'erased,' i.e. takes the value $?$, with probability $\varepsilon$.

Then, since $X$ is determined exactly when at least one output is not erased, and $H(X)=1$, $$ I(X;Y_1,Y_2) = H(X) - H(X|Y_1, Y_2) = H(X) - P(Y_1 = Y_2 = ?)\,H(X) = 1- \varepsilon^2,$$ while $$ I(X;Y) = H(X) - H(X|Y) = H(X) - P(Y = ?)\,H(X) = 1- \varepsilon. $$
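The gap is easy to confirm numerically. The sketch below (an illustrative check, not part of the original answer; the value $\varepsilon = 0.25$ and all function names are my own choices) builds the joint pmf of $(X, Y_1, Y_2)$ for two conditionally independent uses of the erasure channel and evaluates both mutual informations directly from the definition $I(X;Y) = \sum p(x,y)\log_2\frac{p(x,y)}{p(x)p(y)}$:

```python
import math

eps = 0.25  # erasure probability (illustrative choice)
SYMBOLS = [0, 1, '?']


def p_y_given_x(y, x):
    """Binary erasure channel: output equals input w.p. 1-eps, '?' w.p. eps."""
    if y == '?':
        return eps
    return (1 - eps) if y == x else 0.0


def joint(x, y1, y2):
    """Joint pmf P(x, y1, y2): X uniform, Y1 and Y2 cond. independent given X."""
    return 0.5 * p_y_given_x(y1, x) * p_y_given_x(y2, x)


def mi_pair():
    """I(X; Y1, Y2) computed directly from the joint pmf."""
    total = 0.0
    for x in (0, 1):
        for y1 in SYMBOLS:
            for y2 in SYMBOLS:
                p = joint(x, y1, y2)
                if p == 0:
                    continue
                p_y = sum(joint(xp, y1, y2) for xp in (0, 1))  # marginal P(y1,y2)
                total += p * math.log2(p / (0.5 * p_y))        # P(x) = 1/2
    return total


def mi_single():
    """I(X; Y) for a single channel use."""
    total = 0.0
    for x in (0, 1):
        for y in SYMBOLS:
            p = 0.5 * p_y_given_x(y, x)
            if p == 0:
                continue
            p_y = sum(0.5 * p_y_given_x(y, xp) for xp in (0, 1))  # marginal P(y)
            total += p * math.log2(p / (0.5 * p_y))
    return total


print(mi_pair())    # 1 - eps^2 = 0.9375
print(mi_single())  # 1 - eps   = 0.75
```

With $\varepsilon = 0.25$ this prints $0.9375 = 1 - \varepsilon^2$ and $0.75 = 1 - \varepsilon$, confirming $I(X;Y_1,Y_2) > I(X;Y)$.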