Computing the entropy of a hidden word based on amount of information in each guess


Say I'm playing hangman where I'm trying to guess a 7-letter word. To guess the word, I pick one letter at a time, and if this letter appears in the word, it gets filled in.


Now let's assume the space from which words are chosen contains 12 7-letter words in a language whose alphabet only has 12 different letters.
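(For reference, under the extra assumption, not stated above, that all 12 words are equally likely a priori, the entropy of the hidden word would just be $\log_2 12 \approx 3.585$ bits. A minimal sketch:)

```python
import math

# Entropy of a uniform distribution over N equally likely outcomes:
# H = log2(N). Here N = 12 candidate 7-letter words (assuming each
# word is equally probable, which the problem does not state).
N = 12
entropy_bits = math.log2(N)
print(round(entropy_bits, 3))  # 3.585
```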

I don't know the word that I'm guessing, but I do know how much information (in bits) I gain on average when I guess a letter in this word correctly. This information is summarized in the table below.

| letters guessed | E[I] (bits) | possible word matches |
| --- | --- | --- |
| 1 | 0.78 | 7.0 |
| 2 | 1.54 | 4.1 |
| 3 | 2.16 | 2.7 |
| 4 | 2.61 | 2.0 |
| 5 | 2.98 | 1.5 |
| 6 | 3.30 | 1.2 |
| 7 | 3.58 | 1.0 |

Is there anything I can infer about the entropy of the hidden word from the data in the table above?

Edit:

This is how I'm computing $E[I]$: after $k$ letters have been guessed, $E[I] = \log_2(12 / m_k)$, where $m_k$ is the expected number of word matches still possible.
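(Since the original image isn't visible here, this is a sketch of one way to produce the $E[I]$ column that is consistent with the table: treat the information gained after $k$ guesses as $\log_2(N/m_k)$, with $N = 12$ and $m_k$ the expected number of remaining matches. Small discrepancies come from the match counts being rounded to one decimal.)

```python
import math

N = 12  # total words in the space
# expected number of possible word matches after k correct guesses
# (taken from the rounded values in the table above)
matches = [7.0, 4.1, 2.7, 2.0, 1.5, 1.2, 1.0]

# information gained: E[I] = log2(N / m_k)
for k, m in enumerate(matches, start=1):
    print(k, round(math.log2(N / m), 2))
```

The last row recovers $\log_2(12/1) \approx 3.58$ bits, the same value as guessing the word outright from 12 equally likely candidates.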