Can infinite random sequences be asymptotically compressed?


A number $0.5<p<1$ is chosen at random and given to two people, A and B, who are allowed to communicate before being separated. A is then given a sequence $S$ of $N$ random bits, where each bit is 1 with probability $p$ and 0 with probability $1-p$. He must send the smallest possible number of bits (which we will denote $M$) to B, such that B can reconstruct the sequence $S$. As $N$ goes to infinity, must the expected value of $N/M$ tend to 1?

Shannon's source coding theorem claims we can get "arbitrarily close to the Shannon entropy, with negligible probability of loss." What is a negligible probability of loss, negligible compared to what? What fraction of bits gets successfully transferred using this method?


Best answer:

"must the expected value of $N/M$ tend to 1?"

I assume that $M$ is not given to $A$ but that he can choose it (for each message), so that the message can be decoded with no error.

Then, by Shannon's first theorem, the expected value of $M$ (assuming that A simply compresses the bit stream using a non-singular compressor) will tend to $N H(p)$, so $N/M$ tends to $1/H(p)$, which is greater than 1 whenever $p \neq 0.5$.
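This is easy to check numerically. The sketch below (my own illustration, not part of the answer) uses the Shannon code length $\lceil -\log_2 P(S) \rceil$ as a stand-in for $M$, since an actual optimal encoder achieves essentially this length; the parameters $p = 0.9$ and $N = 100{,}000$ are arbitrary choices.

```python
import math
import random

def entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def shannon_code_length(bits, p):
    """Shannon code length for one sequence: ceil(-log2 P(S)) bits."""
    ones = sum(bits)
    zeros = len(bits) - ones
    log_prob = ones * math.log2(p) + zeros * math.log2(1 - p)
    return math.ceil(-log_prob)

random.seed(0)
p, N = 0.9, 100_000
bits = [1 if random.random() < p else 0 for _ in range(N)]
M = shannon_code_length(bits, p)

# M/N concentrates near H(p) ~ 0.469 for p = 0.9,
# so N/M is near 1/H(p) ~ 2.13, not 1.
print(M / N, N / M, entropy(p))
```

By the law of large numbers, $-\frac{1}{N}\log_2 P(S)$ concentrates around $H(p)$, which is why $M/N$ lands close to the entropy for large $N$.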

Another answer:

In your setup, the bits get transported successfully, but it is possible that the random bit sequence is so unusual that the agreed-upon encoding requires significantly more bits than expected. In the long run, everything will (almost surely) level out, of course.

The problem you cite is more along the following lines, from a practical point of view: if we have a communication channel that can transport only so-and-so many encoded bits (per second, say), then we can usually communicate many more of the biased random bits by using a suitable encoding, but from time to time "unusual patterns" occur that would require us to send a longer encoded message. We must then either accept that we might fall behind in time (which may be undesirable for real-time applications) or drop some information on such occasions.
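The occasional "unusual pattern" can be quantified. The sketch below (my own illustration; the block length $N = 1000$ and rate margin $\varepsilon = 0.05$ bits per source bit are arbitrary assumptions) estimates how often a block's Shannon code length exceeds a fixed budget of $N(H(p) + \varepsilon)$ bits:

```python
import math
import random

random.seed(1)
p, N, eps, trials = 0.9, 1000, 0.05, 2000

# Binary entropy H(p) and the fixed per-block bit budget.
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
budget = N * (H + eps)

overflows = 0
for _ in range(trials):
    ones = sum(1 for _ in range(N) if random.random() < p)
    # Shannon code cost -log2 P(S) for this block.
    cost = -(ones * math.log2(p) + (N - ones) * math.log2(1 - p))
    if cost > budget:
        overflows += 1

overflow_rate = overflows / trials
print(overflow_rate)
```

The overflow probability is small but nonzero for any finite $N$; it vanishes as $N \to \infty$ for any fixed $\varepsilon > 0$, which is exactly the "negligible probability of loss" in the source coding theorem.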