In my research I came across an equation featuring the "exponential entropy" term $\mathrm{e}^{H(p_{i})}$, and I wonder whether it has a specific meaning. I have found only scattered references to it (usually in terms of dispersion or the "spread of the distribution"), so I'm looking for more insight. I work with natural logarithms, and in my case the entropy is Shannon's: $H(p_{i})=-\sum_{i}{ p_{i}\ln p_{i}}$. My question is: what is $\mathrm{e}^{H(p_{i})}$?
Note: I assume the same question would arise if I were to work in log-base 2. So is there a meaning to $2^{H(p_{i})}$ when the entropy is instead defined by $H(p_{i})=-\sum_{i}{p_{i}\log_{2} p_{i}}$?
The term $2^{H(p)}$ can be interpreted in terms of the notion of a typical set, which plays a fundamental role in information theory.
Imagine that you draw $n$ i.i.d. samples from some distribution $p$. Let $x_1, x_2, \dots, x_n$ denote the sequence of samples you drew. The law of large numbers can be used to show that as $n\to \infty$, $-\tfrac{1}{n}\log_2 p(x_1,\dots,x_n)\to H(p)$, i.e. $p(x_1,\dots,x_n)\approx 2^{-nH(p)}$ with high probability. In other words, in the limit of large $n$, the distribution over sequences basically becomes a uniform distribution over a set of $2^{nH(p)}$ sequences.
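You can check this convergence numerically. A minimal sketch (using NumPy and a Bernoulli source as a toy example; the setup is mine, not from any specific paper): draw $n$ i.i.d. samples, compute the per-symbol log-probability of the realized sequence, and watch it approach $H(p)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy source: Bernoulli(0.3), so H(p) = -(0.7*log2(0.7) + 0.3*log2(0.3))
p = np.array([0.7, 0.3])
H = -np.sum(p * np.log2(p))

for n in [100, 10_000, 1_000_000]:
    x = rng.choice(2, size=n, p=p)    # n i.i.d. samples from p
    log_prob = np.sum(np.log2(p[x]))  # log2 p(x_1, ..., x_n)
    # The AEP says -log_prob / n -> H(p) as n grows
    print(n, -log_prob / n, H)
```

With increasing $n$, the middle column settles onto $H(p)\approx 0.881$, which is exactly the statement $p(x_1,\dots,x_n)\approx 2^{-nH(p)}$.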
A bit more formally, for any $\epsilon >0$ and a large enough $n$, there is a "typical set" of sequences $A_\epsilon^n$ of size $|A_\epsilon^n| \approx 2^{nH(p)}$, such that a random $n$-long sequence will belong to $A_\epsilon^n$ with high probability ($\ge 1-\epsilon$). (See the Wikipedia article on typical sets for more precise details.)
Note that $2^{nH(p)} = \prod_{i=1}^n 2^{H(p)}$. Thus, in the limit of large $n$, $ 2^{H(p)}$ is the multiplicative rate with which the size of the typical set grows with each additional sample from $p$.
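Relatedly, $2^{H(p)}$ on its own is often called the perplexity of $p$, and behaves like an "effective number of outcomes": it equals the alphabet size for a uniform distribution and shrinks toward 1 as the distribution concentrates. A small sketch (the function name `perplexity` is mine):

```python
import numpy as np

def perplexity(p):
    """2**H(p): the 'effective number of outcomes' of p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * log 0 = 0
    H = -np.sum(p * np.log2(p))
    return 2.0 ** H

print(perplexity([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 -> exactly 4.0
print(perplexity([0.7, 0.1, 0.1, 0.1]))      # skewed -> about 2.56
print(perplexity([1.0, 0.0, 0.0, 0.0]))      # deterministic -> 1.0
```

The same reading applies to $\mathrm{e}^{H(p)}$ with natural-log entropy; only the base of the exponential changes, not the value.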