Lower bound on the limiting codeword length


Here is the question I'm trying to solve. I don't really have any idea how to approach it or which theorem to use.

For $ p, \lambda >0$, let $m(n,p,\lambda)$ be defined to be the least $m$ such that for any distribution on the space $\{0,1\}^n$ (of possible messages $\textbf X$) there exists an encoding function $c:\{0,1\}^n \to\{0,1\}^m$ and a decoding function $d:\{0,1\}^m \to\{0,1\}^n$, which when used with the binary symmetric channel with error probability $p$ give $\mathbb P(d(\textbf Y)=\textbf X) > 1-\lambda$.

Prove that $$\lim_{\lambda \to 0} \lim_{n \to \infty} \frac{m(n,p,\lambda)}{n} \geq \frac{1}{1-h(p)}$$ where $h(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy function.


This is the converse part of the capacity theorem for the binary symmetric channel: the reciprocal of the left-hand side, $\lim n/m$, is the rate, and $C = 1 - h(p)$ is the capacity.
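A sketch of how the converse argument usually goes, assuming $\mathbf X$ is uniform on $\{0,1\}^n$ (the worst-case distribution, so $H(\mathbf X) = n$) and writing $\lambda$ for the error probability bound $\mathbb P(d(\mathbf Y) \neq \mathbf X) \leq \lambda$:

```latex
% Converse sketch via Fano's inequality and data processing.
% X uniform on {0,1}^n, so H(X) = n bits.
\begin{align*}
n = H(\mathbf X)
  &= H(\mathbf X \mid d(\mathbf Y)) + I(\mathbf X; d(\mathbf Y)) \\
  &\le \bigl( h(\lambda) + \lambda n \bigr) + I(c(\mathbf X); \mathbf Y)
     && \text{(Fano; data processing)} \\
  &\le h(\lambda) + \lambda n + m \,(1 - h(p))
     && \text{(BSC capacity per channel use).}
\end{align*}
% Rearranging gives
%   m/n >= (1 - lambda - h(lambda)/n) / (1 - h(p)),
% and taking n -> infinity, then lambda -> 0, yields the stated bound.
```

Here Fano's inequality bounds the residual uncertainty after decoding, $H(\mathbf X \mid d(\mathbf Y)) \le h(\lambda) + \lambda \log_2(2^n - 1) \le h(\lambda) + \lambda n$, and the data processing inequality (applied to the Markov chain $\mathbf X \to c(\mathbf X) \to \mathbf Y \to d(\mathbf Y)$) together with the memorylessness of the channel bounds the mutual information by $m$ times the per-use capacity.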