Relation between independence number and channel capacity


Suppose $P_{Y|X}$ is a discrete memoryless channel with confusability graph $G$ and capacity $C = \max_{P_X} I(X;Y)$. I want to prove the following relation:

$\log{\alpha(G)}\le C$

where $\alpha(G)$ is the independence number of the graph $G$.

I have an intuitive feeling for why this is the case: the independence number gives the size of the largest message set that I can communicate with zero error in a single channel use. The channel then has to support at least the information content of such a message, which is $\log{\alpha(G)}$ bits. But how do I prove this?

**Best answer**

By the converse to the channel coding theorem, if there exists a sequence of $\left(2^{nR}, n\right)$-codes such that the average probability of decoding error goes to zero as $n \to \infty$, then $R \leq C$, where $C$ is the channel capacity. Therefore, it is sufficient to construct such a sequence of codes with rate $R = \log \alpha(G)$.

Consider the following code with block length $1$ ($n=1$). Fix a maximum independent set of the confusability graph, i.e. $\alpha(G)$ input symbols no two of which are confusable, and use its $i$-th symbol to send message $i$. Since the symbols used by the code form an independent set in the confusability graph, the probability of error is zero (even at finite block length!). This is a $\left(2^{\log \alpha(G)}, 1\right)$ code; using it $n$ times in succession gives a zero-error $\left(2^{n \log \alpha(G)}, n\right)$ code for every $n$, so the converse yields $\log \alpha(G) \leq C$.
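The bound can be checked numerically on Shannon's classic noisy typewriter: 5 input symbols, each received as itself or its cyclic successor with probability $1/2$, so the confusability graph is the pentagon $C_5$. A minimal sketch, assuming a brute-force computation of $\alpha(G)$ and using the known closed-form capacity $C = \log_2 5 - 1$ of this channel (uniform input is optimal by symmetry):

```python
from itertools import combinations
from math import log2

def independence_number(n_vertices, edges):
    """Brute-force alpha(G): size of the largest vertex subset
    containing no edge (fine for small graphs only)."""
    edge_set = {frozenset(e) for e in edges}
    for size in range(n_vertices, 0, -1):
        for subset in combinations(range(n_vertices), size):
            if all(frozenset(p) not in edge_set
                   for p in combinations(subset, 2)):
                return size
    return 0

# Noisy typewriter: input i is confusable with input i+1 (mod 5),
# so the confusability graph is the 5-cycle (pentagon).
pentagon = [(i, (i + 1) % 5) for i in range(5)]
alpha = independence_number(5, pentagon)  # alpha(C_5) = 2

C = log2(5) - 1  # capacity of the 5-symbol noisy typewriter, in bits
print(alpha, log2(alpha), C)  # log2(alpha) = 1.0, C is about 1.32
assert log2(alpha) <= C
```

Here $\log_2 \alpha(G) = 1$ bit per channel use is achieved with zero error (e.g. transmit only symbols 0 and 2, which are never confused), strictly below the Shannon capacity of about $1.32$ bits.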