Shannon's Noisy Coding Theorem


I'm confused about what is meant by the rate. At the bottom of pages 15 and 16 here http://www0.maths.ox.ac.uk/system/files/coursematerial/2014/3093/6/Lecture_notes.pdf it seems to say that the rate is the same as the entropy of the source. Shouldn't it instead say that we need $H/\log_2 k$ to be less than the channel capacity, where each source symbol $x \in A$ is coded by $c \colon A \to B^k$ as $c(x)$, which has length $k$?


1 Answer


Entropy $H$, as stated, is per source symbol, so a source output of $n$ symbols has entropy $nH$. Note that the theorem on p. 15 encodes a message of length $n$: comparing the total entropy $nH$ of the message against $n$ uses of a channel with capacity $C$ recovers the per-symbol condition $H < C$, so the rate in the theorem is indeed the entropy per source symbol.
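The additivity used above (entropy of $n$ i.i.d. source symbols equals $nH$) can be checked numerically. The sketch below uses a hypothetical three-symbol source with probabilities $1/2, 1/4, 1/4$ (not taken from the lecture notes) and verifies that the joint entropy of two independent symbols is exactly $2H$:

```python
import math
from itertools import product

def entropy(p):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical source: three symbols with probabilities 1/2, 1/4, 1/4
p = [0.5, 0.25, 0.25]
H = entropy(p)  # per-symbol entropy, here 1.5 bits

# Joint distribution of two i.i.d. source symbols: P(x1, x2) = P(x1) * P(x2)
joint = [a * b for a, b in product(p, p)]
H2 = entropy(joint)

# For an i.i.d. source, H(X1, X2) = 2 * H(X)
print(H, H2)  # H2 equals 2 * H
```

The same check extends to any $n$: taking the $n$-fold product distribution gives joint entropy $nH$, which is why the theorem compares $nH$ bits of message against $n$ channel uses.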