Quantity of information in an 8-letter word

How much information does an 8-letter word carry, if you can only use the letters a and b, where the probability of a is $\dfrac{2}{3}$ and the probability of b is $\dfrac{1}{3}$? I know that I probably have to use Shannon entropy, so I have given it a try.

$H=-\displaystyle\sum_{i=1}^{n}p_i\log_2(p_i)=-\left(\frac{2}{3}\log_2\left(\frac{2}{3}\right)+\frac{1}{3}\log_2\left(\frac{1}{3}\right)\right)\approx0.918$

What do I have to do now? As far as I know, that is the number of bits needed to "store" one letter of that word. How do I continue from this point?
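For reference, here is a minimal Python sketch (not part of the original question) that evaluates the entropy formula above numerically, assuming the letter probabilities $p(a)=\frac{2}{3}$ and $p(b)=\frac{1}{3}$ given in the question:

```python
from math import log2

probs = [2/3, 1/3]  # p(a), p(b)

# H = -sum_i p_i * log2(p_i), in bits per letter
H = -sum(p * log2(p) for p in probs)
print(f"Entropy per letter: {H:.4f} bits")  # ~0.9183
```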

BEST ANSWER

What you have computed is the entropy per symbol (letter). This can be interpreted, as you say, as the number of bits needed to code each letter, but also as the average amount of information that each letter provides.

If the word has eight letters, then the information per word is eight times that value. (Strictly speaking, this assumes that the letters are independent, but lacking further information, that is what you are expected to assume.)
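Concretely, with the per-letter entropy computed above, the total information in the word is

$$H_{\text{word}} = 8\,H \approx 8 \times 0.918 \approx 7.35 \text{ bits}.$$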