Applying Shannon's Noiseless Coding Theorem


I'm working through some exam questions in preparation for exam season. We are given:

"A discrete memoryless source W outputs the symbol a with probability 1/2, the symbol b with probability 1/4, the symbol c with probability 1/8 and the symbol d with probability 1/8... a->111, b->011, c->01, d->0"

Following Huffman encoding I have that a->0, b->10, c->110, d->111.

The Question is: "Show that the average word length of this Huffman encoding satisfies the lower bound given by Shannon’s Noiseless Coding Theorem."

So it is easy to compute that H(W) = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 7/4 bits, since log₂2 = 1, log₂4 = 2 and log₂8 = 3.
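As a quick numerical sanity check on this entropy value, here is a short sketch (the probabilities are taken straight from the question; the dictionary name is just illustrative):

```python
import math

# Source probabilities given in the question
probs = {"a": 1/2, "b": 1/4, "c": 1/8, "d": 1/8}

# Shannon entropy in bits: H(W) = sum over symbols of p * log2(1/p)
H = sum(p * math.log2(1 / p) for p in probs.values())
print(H)  # 1.75, i.e. 7/4 bits
```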

But I don't understand how to compute the average word length n̄, so I can't obtain n̄ = 7/4 to conclude!