Inconsistency in the capability of random coding between information theory and coding theory


In information theory, it is well known that the capacity of a channel can be achieved asymptotically by the random coding method (Chapter 7 of Cover & Thomas's textbook).

However, in coding theory, the same random coding method is only shown to yield a code achieving the Gilbert–Varshamov bound (I will explain below how this relates to channel capacity and where the apparent inconsistency lies). To be precise, let $B_n(r) = |\{ x \in \mathbb{F}_2^n : \mathrm{wt}(x) \le r \}|$ denote the size of the Hamming ball of radius $r$.

Theorem (G-V bound): There exists a code $\mathcal{C} \subset \mathbb{F}_2^n$ with minimum distance at least $d$ and size $|\mathcal{C}| \ge \frac{2^n}{2B_n(d-1)}$.
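As a concrete check (my own sketch, not part of the post), the Hamming-ball size and the guaranteed code size can be computed directly:

```python
from math import comb

def ball_size(n: int, r: int) -> int:
    """B_n(r): number of words in F_2^n with Hamming weight at most r."""
    return sum(comb(n, w) for w in range(r + 1))

def gv_size(n: int, d: int) -> int:
    """Code size 2^n / (2 * B_n(d - 1)) guaranteed by the theorem above."""
    return 2**n // (2 * ball_size(n, d - 1))

print(ball_size(20, 4))  # → 6196
print(gv_size(20, 5))    # → 84
```

So for $n = 20$ and $d = 5$ the theorem already guarantees a code of 84 words.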

This result can be shown by choosing $x_1, x_2, \dots, x_{|\mathcal{C}|}$ uniformly and independently at random and then expurgating codewords that lie too close together (which accounts for the factor $2$ in the denominator). For the proof details, see the lecture note.
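A minimal simulation of this random-selection-plus-expurgation argument (the function names and the greedy expurgation order are my own choices; the precise proof is in the lecture note):

```python
import random

def hamming(x: int, y: int) -> int:
    """Hamming distance between two words represented as integers."""
    return bin(x ^ y).count("1")

def random_code(n: int, d: int, m: int, seed: int = 1) -> list:
    """Draw m uniform words from F_2^n; keep a word only if it lies at
    distance >= d from every word kept so far (greedy expurgation)."""
    rng = random.Random(seed)
    code = []
    for _ in range(m):
        w = rng.randrange(2**n)
        if all(hamming(w, c) >= d for c in code):
            code.append(w)
    return code

code = random_code(n=15, d=5, m=500)
# By construction, every pair of surviving codewords is at distance >= 5.
print(len(code))
```

The expurgation step guarantees the distance requirement deterministically; randomness only affects how many codewords survive.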

Choosing $d = 2\delta n$ (with $\delta < 1/4$) and taking the limit $n \to \infty$, we have $B_n(d-1) \approx 2^{nh(2\delta)}$ (see the asymptotics of binomial coefficients, also page 353 of the textbook). Such a code can correct $t \approx \delta n$ errors. Applying this code to a binary symmetric channel with crossover probability $\delta$, we get the message rate $R = \frac{\log_2 |\mathcal{C}|}{n} = 1 - h(2\delta)$ with vanishing error probability.
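The asymptotic $\frac{1}{n}\log_2 B_n(2\delta n - 1) \to h(2\delta)$ can be checked numerically (a sketch of mine, exact ball sizes vs. the entropy bound):

```python
from math import comb, log2

def h(p: float) -> float:
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def ball_exponent(n: int, r: int) -> float:
    """(1/n) * log2 B_n(r): exponential growth rate of the ball size."""
    return log2(sum(comb(n, w) for w in range(r + 1))) / n

delta = 0.05
for n in (100, 1000, 10000):
    r = int(2 * delta * n) - 1
    print(n, round(ball_exponent(n, r), 4), round(h(2 * delta), 4))
```

The second column approaches the third as $n$ grows, matching the claimed exponent.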

The inconsistency: in the information-theoretic proof, random coding achieves the rate $R = 1 - h(\delta)$ asymptotically. In the G-V bound proof, the same random coding method only yields $R = 1 - h(2\delta)$.

Why is there such a gap in the capability of random coding? Is it possible to use random coding to construct a code with distance $d$ and rate $R \approx 1 - h(\frac{d}{2n})$? If so, does that mean we can get an improved version of the G-V bound (in the $n \to \infty$ limit) with the same random coding?
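To see the size of the gap the question asks about, one can tabulate the two rates side by side (an illustration of mine, not from the post):

```python
from math import log2

def h(p: float) -> float:
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

print(f"{'delta':>6} {'1-h(delta)':>12} {'1-h(2delta)':>12}")
for delta in (0.01, 0.05, 0.10, 0.20):
    print(f"{delta:6.2f} {1 - h(delta):12.4f} {1 - h(2 * delta):12.4f}")
```

The distance-based rate $1 - h(2\delta)$ is strictly below the capacity $1 - h(\delta)$ for every $0 < \delta < 1/4$, and it hits zero already at $\delta = 1/4$.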

Answer:

The GV bound is only a lower bound on the best achievable rate, so there is no inconsistency; it just means the GV bound is not tight. The gap arises because requiring minimum distance $2\delta n$ forces the code to correct *every* error pattern of weight up to $\delta n$, a worst-case demand, whereas achieving capacity on a BSC($\delta$) only requires correcting *typical* error patterns, which is a much weaker requirement.