Information theory: Channel capacity equivalence proof


I have a problem that I can't prove. We have three input symbols $(a,b,c)$ and two output symbols $(x,y)$. If the input symbol is $a$, then the probability of getting $x$ is $p$ and the probability of getting $y$ is $1-p$. If the input symbol is $b$ or $c$, then the probability of getting $x$ is $q$ and of getting $y$ is $1-q$. Prove that the capacity of this channel would be the same if there were no symbol $c$. This is the matrix:

$$\begin{pmatrix} p & 1-p \\ q & 1-q \\ q & 1-q \end{pmatrix}$$

1 Answer


Consider the probability distribution of your source $S$: $$P(S=a)=\alpha,\quad P(S=b)=\beta,\quad\textrm{and } P(S=c)=\gamma.$$ Then you can write the distribution of the output $O$ in terms of $\alpha$, $\beta$ and $\gamma$: $$P(O=x)=\alpha p+(\beta+\gamma)q.$$ You should realize that it depends only on $\alpha$ and $(\beta+\gamma)=1-\alpha$, and hence so does its entropy $H(O)$. The same thing happens when you compute the conditional entropy $$H(O\mid S)=\alpha h(p)+(\beta+\gamma)h(q),$$ where $h$ denotes the binary entropy function. The mutual information of $S$ and $O$, $$I(S;O)=H(O)-H(O\mid S),$$ is therefore a function of $\alpha$ alone. It reaches its maximum value, the capacity of the channel, independently of how $1-\alpha$ is split between $\beta$ and $\gamma$, and that maximum is exactly the same as for the second channel (the one without $c$, or if you prefer, with $\gamma=0$).
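The argument above can be checked numerically. Here is a minimal sketch (my own, not part of the original answer): since $I(S;O)$ depends only on $\alpha$, we can compute it as a function of $\alpha$ and maximize by brute force; this single maximization covers both the three-symbol channel (any split of $1-\alpha$ between $b$ and $c$) and the two-symbol channel ($\gamma=0$). The function names and the grid size are arbitrary choices for illustration.

```python
import math

def h(x):
    # Binary entropy in bits, with the convention h(0) = h(1) = 0.
    if x <= 0.0 or x >= 1.0:
        return 0.0
    return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

def mutual_info(alpha, p, q):
    # I(S;O) = H(O) - H(O|S).
    # P(O=x) = alpha*p + (1-alpha)*q, and
    # H(O|S) = alpha*h(p) + (1-alpha)*h(q),
    # both depending only on alpha -- the key point of the proof.
    px = alpha * p + (1 - alpha) * q
    return h(px) - (alpha * h(p) + (1 - alpha) * h(q))

def capacity(p, q, steps=10000):
    # Brute-force maximization of I(S;O) over alpha in [0, 1].
    return max(mutual_info(i / steps, p, q) for i in range(steps + 1))
```

For a sanity check, `capacity(1.0, 0.0)` is the noiseless binary case and comes out as 1 bit, while `capacity(0.5, 0.5)` is a useless channel with capacity 0. Whatever value `capacity(p, q)` returns is simultaneously the capacity of the three-input and the two-input channel, since the split of $1-\alpha$ never enters the computation.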