Binary symmetric channel, Average probability of error goes to zero


Let us have the following definition: $W$ is an alphabet with $|W| = 2^n$ elements, and a codebook $C$ maps each element of the alphabet uniquely to a binary sequence of length $n$. The codeword is sent over the $n$-th extension of the binary symmetric channel with crossover probability $p$, that is, $P[Y = 0 \mid X = 1] = P[Y = 1 \mid X = 0] = p$. The received sequence is decoded with $C^{-1}$.

My question is: how can we find and show conditions on $p$ under which the average probability of error goes to zero as $n \rightarrow \infty$?

On BEST ANSWER

The capacity of the binary symmetric channel is $1-H(p)$ where $H(p)$ is the Shannon entropy of a Bernoulli(p) distribution (see, for example, Cover and Thomas, Elements of Information Theory 2e, Chapter 7).
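To make the capacity formula concrete: $H(p) = -p \log_2 p - (1-p)\log_2(1-p)$, so the capacity $1 - H(p)$ is $1$ at $p = 0$ or $p = 1$ and drops to $0$ at $p = 1/2$. A minimal sketch (function names are my own):

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy H(p) of a Bernoulli(p) distribution, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0 * log 0 = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity 1 - H(p) of a BSC with crossover probability p, in bits per channel use."""
    return 1.0 - binary_entropy(p)
```

For example, `bsc_capacity(0.0)` is `1.0`, `bsc_capacity(0.5)` is `0.0`, and the capacity is symmetric in $p$ and $1-p$.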

The rate of an $(M, n)$ code (which maps $M$ input values to binary vectors of length $n$) is $\frac{\log_2 M}{n}$.

The channel coding theorem says that any rate below the capacity is achievable, i.e. there exist codes at that rate whose average (and maximum) probability of error goes to zero as $n \rightarrow \infty$.

In your case, it looks like you want a $(2^n, n)$ code, which has rate $1$. Rate $1$ is below the capacity $1 - H(p)$ only when $H(p) = 0$, so you can communicate with vanishing probability of error only if $p = 0$ or $p = 1$.