A number $0.5<p<1$ is chosen at random and given to two people, A and B, who are allowed to communicate before being separated. A is then given a sequence $S$ of $N$ random bits, where each bit independently has probability $p$ of being 1 and $1-p$ of being 0. He must send the smallest possible number of bits (which we will denote $M$) to B, such that B can reconstruct the sequence $S$. As $N$ goes to infinity, must the expected value of $N/M$ tend to 1?
Shannon's source coding theorem claims we can get "arbitrarily close to the Shannon entropy, with negligible probability of loss." What is a negligible probability of loss — negligible compared to what? What percentage of bits get successfully transferred using this method?
I assume that $M$ is not fixed in advance but that A can choose it (separately for each message), so that the message can be decoded with no error.
Then, by Shannon's source coding theorem, the expected value of $M$ (assuming that A simply compresses the bit stream using a non-singular compressor) will tend to $N H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$ is the binary entropy. Consequently $N/M$ tends to $1/H(p) > 1$, not to 1.
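To see the rate $H(p)$ emerge concretely, here is a small sketch (my own illustration, not part of the question) that Huffman-codes blocks of $k$ bits from a Bernoulli($p$) source. The expected length per source bit is at most $H(p) + 1/k$, so it approaches the entropy as the block size grows:

```python
import heapq
from itertools import product
from math import log2

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), bits of entropy per source bit."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def huffman_expected_length(probs):
    """Expected codeword length of a Huffman code for distribution `probs`.

    Uses the fact that E[L] equals the sum of the merged probabilities
    over all internal merges of the Huffman tree.
    """
    heap = [(q, i, 0.0) for i, q in enumerate(probs)]  # (prob, tiebreak, subtree cost)
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        q1, _, c1 = heapq.heappop(heap)
        q2, _, c2 = heapq.heappop(heap)
        # Merging two subtrees lengthens every codeword inside them by one bit,
        # which adds q1 + q2 to the expected length.
        heapq.heappush(heap, (q1 + q2, counter, c1 + c2 + q1 + q2))
        counter += 1
    return heap[0][2]

p = 0.9
print(f"H({p}) = {binary_entropy(p):.4f} bits per source bit")

rates = {}
for k in (1, 2, 4, 8):
    # Enumerate all 2^k blocks of k i.i.d. Bernoulli(p) bits.
    probs = []
    for block in product((0, 1), repeat=k):
        q = 1.0
        for b in block:
            q *= p if b == 1 else 1 - p
        probs.append(q)
    rates[k] = huffman_expected_length(probs) / k
    print(f"block size {k}: Huffman rate = {rates[k]:.4f} bits per source bit")
```

For $k=1$ the rate is exactly 1 (each bit costs one bit), and the rate decreases toward $H(0.9)\approx 0.469$ as $k$ grows, matching the $N H(p)$ asymptotic above.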