Mutual information staying constant under composition of channels


Consider the following scenario: one has two communication channels $C_1$ and $C_2$. Let $p_0(x)$ be some arbitrary but fixed input probability distribution.

By the data processing inequality, the mutual information between the input and the output of $C_1$ must be greater than or equal to the mutual information between the input and the output of the composed channel $C_2\circ C_1$ (i.e., act with $C_1$ first, then feed its output to $C_2$). What are the necessary and sufficient conditions on $C_2$ for equality, i.e. for the mutual information to stay the same under the composition? Of course, if $C_2$ is the identity, or more generally a permutation channel, equality holds.

PS: To be precise, by the mutual information between the input and output of a channel I mean the mutual information of the joint distribution obtained by multiplying each element of the transition matrix by the corresponding component of the input, $p(x,y)=p(y|x)p_{0}(x)$.
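As a quick numerical sanity check of the inequality and the two extreme cases, here is a small sketch (channel matrices and the input distribution are made up for illustration; channels are row-stochastic matrices, and composition $C_2\circ C_1$ corresponds to the matrix product $W_1 W_2$):

```python
import numpy as np

def mutual_information(p0, W):
    """I(X;Y) in bits for input distribution p0 and row-stochastic
    transition matrix W, via the joint p(x,y) = p(y|x) p0(x)."""
    p_xy = p0[:, None] * W              # joint distribution p(x, y)
    p_y = p_xy.sum(axis=0)              # output marginal p(y)
    prod = p0[:, None] * p_y[None, :]   # product of marginals p(x) p(y)
    mask = p_xy > 0                     # skip zero-probability terms
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / prod[mask])))

p0 = np.array([0.5, 0.3, 0.2])          # arbitrary fixed input distribution
C1 = np.array([[0.8, 0.1, 0.1],         # some noisy channel C_1
               [0.1, 0.8, 0.1],
               [0.1, 0.1, 0.8]])
perm = np.array([[0., 1., 0.],          # C_2 = a permutation channel
                 [0., 0., 1.],
                 [1., 0., 0.]])
erase = np.full((3, 3), 1/3)            # C_2 = completely randomizing channel

I1      = mutual_information(p0, C1)          # I(X; C_1(X))
I_perm  = mutual_information(p0, C1 @ perm)   # equality: permutation only relabels
I_erase = mutual_information(p0, C1 @ erase)  # information destroyed entirely
```

Here `I_perm` equals `I1` exactly (a permutation merely relabels the outputs), while `I_erase` is zero, consistent with the data processing inequality `I1 >= I_erase`.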