How to know the length of message from the entropy

I have a message composed of four symbols, and each symbol has a different probability.

For example: a message is built from the symbols A, B, C, D, and the probabilities of their occurrences are x, y, z, t.

The question is: how can I calculate the required number of these symbols so that the entropy of the message is equal to or greater than N bits, for example 200 bits?

I know that I have to use Shannon's formula, but how?

Best Answer

You can find what you need at:

http://bearcave.com/misl/misl_tech/wavelets/compression/shannon.html

The average number of bits (entropy) per symbol is:

$H = -[x \log_2{x} + y \log_2{y} + z \log_2{z} + t \log_2{t}]$

The number of symbols you're looking for is then N/H on average; since the count must be an integer, round up to $\lceil N/H \rceil$ to guarantee at least N bits.
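As a sketch, the calculation above can be written as a small Python function. The probability values below are hypothetical stand-ins for x, y, z, t, since the question leaves them unspecified:

```python
import math

def required_length(probs, n_bits):
    """Number of i.i.d. symbols needed so total entropy reaches n_bits."""
    # Shannon entropy per symbol: H = -sum(p * log2(p)), skipping p = 0
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    # N/H symbols on average; round up to guarantee at least n_bits
    return math.ceil(n_bits / h)

# Hypothetical probabilities for A, B, C, D (i.e., x, y, z, t)
probs = [0.5, 0.25, 0.125, 0.125]
# H = 0.5*1 + 0.25*2 + 0.125*3 + 0.125*3 = 1.75 bits/symbol
print(required_length(probs, 200))  # ceil(200 / 1.75) = 115
```

With these example probabilities, a message needs 115 symbols to carry at least 200 bits of entropy.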