In information entropy, how do nats relate to any representation of states?


Calculating the information entropy depends on taking the logarithms of probabilities in some base. If I use base 2, the entropy is measured in "bits". The bit measure corresponds naturally to representing states as a binary tree, i.e. as strings of binary digits.

Is there a representation that corresponds to "nats", calculating the entropy using ln instead of log base 2?

For "bits" and log base 2, I can use Huffman encoding to assign codewords to my states such that the weighted average codeword length comes close to the entropy computed with log base 2. For example, if I calculate the probability of every 2-card blackjack hand, I can build a binary tree and serialize the states as

Value  Representation
4      0100111
16     0011
20     100
21     00010
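The Huffman construction above can be sketched in a few lines of Python. The distribution below is a hypothetical toy example, not the real 2-card blackjack probabilities; the point is only that the resulting average codeword length lands within one bit of the base-2 entropy:

```python
import heapq
import math

def huffman_lengths(probs):
    """Return the Huffman codeword length for each symbol."""
    # Heap entries: (probability, tie-breaking counter, symbols in subtree)
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    count = len(heap)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        # Each merge pushes every symbol in both subtrees one level deeper,
        # adding one bit to its codeword.
        for s in syms1 + syms2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, count, syms1 + syms2))
        count += 1
    return lengths

# Hypothetical distribution (illustrative only)
probs = {"4": 0.05, "16": 0.15, "20": 0.35, "21": 0.10, "other": 0.35}
lengths = huffman_lengths(probs)
avg_len = sum(probs[s] * lengths[s] for s in probs)
h_bits = -sum(p * math.log2(p) for p in probs.values())
```

Here `avg_len` is guaranteed to satisfy `h_bits <= avg_len < h_bits + 1`, which is the sense in which the weighted average length "comes close" to the base-2 entropy.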

If I use the natural logarithm (ln) instead of log base 2, the resulting entropy is in "nats".

Measure                  Value
Weighted average length  3.979
Entropy (bits)           3.93419124
Entropy (nats)           2.726973564
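The two entropy figures differ only by a change-of-base constant: nats = bits × ln 2 (and indeed 3.93419124 × ln 2 ≈ 2.726973564). A minimal sketch, using an illustrative distribution rather than the blackjack figures above:

```python
import math

def entropy(probs, base):
    """Shannon entropy of a distribution in the given log base."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Hypothetical distribution (illustrative only)
probs = [0.05, 0.15, 0.35, 0.10, 0.35]
h_bits = entropy(probs, base=2)
h_nats = entropy(probs, base=math.e)

# The measures agree up to the constant factor ln 2
assert abs(h_nats - h_bits * math.log(2)) < 1e-12
```

So switching from bits to nats rescales the number but does not change what it measures; the question of a natural *representation* for nats is separate.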

The log base 2 value corresponds roughly to a binary tree / binary representation, but that correspondence has no obvious analogue for nats.

Is there a representation that corresponds to the "nats" value?