Minimum mean length of code words


I need help with this task, so if anyone is willing to help me, I would be grateful. The task is: a discrete information source generates symbols from the set {M, A, $} with probabilities {0.4, 0.4, 0.2}, respectively.

a) Determine the entropy of the source. What is the minimum mean length of lossless code words (binary coding) according to Shannon's first theorem?

I have calculated the entropy, which comes out to $H(S) = 1.52~[\frac{\mathrm{Sh}}{\mathrm{symb}}]$.
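As a sanity check on that value, here is a minimal sketch of the entropy computation in Python (the probabilities are the ones given in the task):

```python
from math import log2

probs = [0.4, 0.4, 0.2]

# H(S) = -sum p_i * log2(p_i), in bits (Sh) per symbol
H = -sum(p * log2(p) for p in probs)
print(round(H, 4))  # 1.5219
```

Rounded to two decimals this agrees with the $1.52$ above.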

In the solution it is written that the minimum mean length of code words according to Shannon's first theorem is: $E(L) \geq \frac{H(S)}{\log_2 D} = 1.52~[\frac{b}{\mathrm{symb}}] \rightarrow E(L) = 1.52~[\frac{b}{\mathrm{symb}}]$

I then applied the Huffman coding algorithm, which is guaranteed to produce a symbol-by-symbol code of minimum mean length, and obtained a minimum mean codeword length of $E(L) = 1.6~[\frac{b}{\mathrm{symb}}]$.
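The Huffman result can be reproduced with a short sketch (a generic heap-based merge of the two least probable nodes; the tie-breaking index is only there to keep the heap comparisons well-defined, and for this source any tie-break yields the same mean length):

```python
import heapq
from math import log2

probs = {"M": 0.4, "A": 0.4, "$": 0.2}

# Huffman codeword lengths: repeatedly merge the two least
# probable nodes; each merge deepens every symbol it contains by 1.
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
length = {s: 0 for s in probs}
tie = len(heap)
while len(heap) > 1:
    p1, _, syms1 = heapq.heappop(heap)
    p2, _, syms2 = heapq.heappop(heap)
    for s in syms1 + syms2:
        length[s] += 1
    heapq.heappush(heap, (p1 + p2, tie, syms1 + syms2))
    tie += 1

# Mean codeword length and source entropy, for comparison
EL = sum(probs[s] * length[s] for s in probs)
H = -sum(p * log2(p) for p in probs.values())
print(EL, round(H, 4))  # 1.6 1.5219
```

The resulting lengths are $\{1, 2, 2\}$, so $E(L) = 0.4 \cdot 1 + 0.4 \cdot 2 + 0.2 \cdot 2 = 1.6$, strictly above the entropy $1.52$.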

I know that Shannon's coding algorithm does not guarantee minimum mean codeword length, unlike Huffman's algorithm, which does. So I wonder why these two values are not the same, when both are supposed to represent the minimum mean length of the codewords?

Thanks in advance!

Best regards!