How to design a code with arbitrarily low error rate at an achievable rate over the Binary Erasure Channel (BEC)


"I am looking for a code that can lower my error rate with a fixed code rate."

Assume a binary erasure channel in which each bit is erased independently with probability 0.5, and no bit is ever flipped.

Then, an achievable rate R is defined as follows: for a channel (𝒳, p(y|x), 𝒴), a rate R is achievable if for every ε > 0 there exists an (M, n) code with (log M)/n ≥ R and maximal error probability λ ≤ ε.

I would like to verify that R = 1/2 is an achievable rate, so I design a code: the index set is {1, 2}, and the codebook is {00, 11}, respectively. In this code, M = 2 and n = 2, so (log M)/n = 1/2 ≥ R. What about the error rate? An error can occur only when both received bits are erased, which happens with probability 0.5 × 0.5 = 0.25.
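As a sanity check on this calculation, here is a small Monte Carlo sketch (my own illustration, not part of the original question) that counts a decoding failure exactly when both bits of a {00, 11} codeword are erased:

```python
import random

def simulate(trials=100_000, p_erase=0.5, seed=0):
    """Estimate the failure probability of the {00, 11} code over BEC(p_erase).

    The decoder is ambiguous exactly when both bits are erased; otherwise
    any single surviving bit identifies the codeword uniquely.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        # Erase each of the two transmitted bits independently.
        erased = [rng.random() < p_erase for _ in range(2)]
        if all(erased):
            failures += 1
    return failures / trials

print(simulate())  # should be close to 0.5 * 0.5 = 0.25
```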

If R = 1/2 is an achievable rate, I should be able to find another (M, n) code with a lower error rate ε. We know that R = 1/2 is achievable because the channel capacity is C = 1 − 0.5 = 0.5. Does anyone have an idea for such a code?

Thanks in advance!

2 Answers

BEST ANSWER

The information theoretic limits become evident in practice with long codewords (i.e., $n\gg1$). I doubt there is any sophisticated algebraic approach for code design with very small $n$ (say, less than $10$), apart from brute-force computer search. Also, consideration of a rate $R=C$ makes the search for a good code a much more difficult, if not impossible, task (compared to the case $R<C$), even when large values of $n$ are considered.
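To illustrate the point about $R<C$ versus $R=C$ numerically, here is a rough Python sketch (my own illustration, not from this answer) estimating the erasure-decoding failure probability of random binary linear codes at rate $R=0.4<C=0.5$ over BEC(0.5). Maximum-likelihood decoding over the BEC succeeds exactly when the generator-matrix columns at the unerased positions have full rank over GF(2); at a rate below capacity the failure probability shrinks as $n$ grows, whereas at $R=C$ it would not vanish.

```python
import random

def gf2_rank(rows):
    """Rank over GF(2) of a matrix whose rows are given as Python ints."""
    rank = 0
    rows = [r for r in rows if r]
    while rows:
        pivot = rows.pop()
        rank += 1
        high_bit = 1 << (pivot.bit_length() - 1)
        # Eliminate the pivot's leading bit from every remaining row.
        rows = [r ^ pivot if r & high_bit else r for r in rows]
        rows = [r for r in rows if r]
    return rank

def failure_rate(n, rate=0.4, p_erase=0.5, trials=2000, seed=1):
    """Estimate P(erasure decoding fails) for fresh random linear codes."""
    rng = random.Random(seed)
    k = max(1, round(rate * n))
    failures = 0
    for _ in range(trials):
        # Random k x n generator matrix, one n-bit integer per row.
        g = [rng.getrandbits(n) for _ in range(k)]
        # Keep each column with prob 1 - p_erase; zeroing out the erased
        # columns leaves the rank of the surviving submatrix unchanged.
        keep = sum(1 << i for i in range(n) if rng.random() >= p_erase)
        if gf2_rank([row & keep for row in g]) < k:
            failures += 1
    return failures / trials

for n in (4, 10, 20, 40):
    print(n, failure_rate(n))  # failure probability drops as n grows
```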

There is a vast literature providing excellent code designs for the BEC with moderate-to-large $n$, typically based on graphs (e.g., Fountain codes).


To add to Stelios' answer (+1): there is indeed a very simple (practically trivial) code ... but only if feedback is allowed: just resend a symbol whenever the decoder reports that it was erased. In this scheme, the number of times each symbol is sent follows a geometric distribution with mean $1/(1-p)=2$ - hence it achieves the capacity.
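A quick Python sketch of this retransmission scheme (my own illustration, assuming $p=0.5$ and instantaneous, error-free feedback), confirming an average of about $2$ channel uses per source bit:

```python
import random

def transmissions_per_symbol(num_symbols=100_000, p_erase=0.5, seed=0):
    """Average number of BEC uses per source bit under send-until-received feedback."""
    rng = random.Random(seed)
    total = 0
    for _ in range(num_symbols):
        sends = 1
        # Keep resending while the channel erases the bit.
        while rng.random() < p_erase:
            sends += 1
        total += sends
    return total / num_symbols

avg = transmissions_per_symbol()
print(avg)      # close to the Geometric(1/2) mean, 1 / (1 - 0.5) = 2
print(1 / avg)  # effective rate, close to the capacity C = 0.5
```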

Of course, one often does not have the luxury of feedback in the coding-decoding process. Nevertheless, it is important to know that Shannon's channel coding theorem remains valid in that case: for a memoryless channel, feedback does not increase the capacity.