Does entropy increase exponentially as the number of observations increases?


Is there an example demonstrating that Shannon's information entropy ($H(X)$, formula below) always increases exponentially as the number of observations in the data increases (preferably using the natural logarithm, $\ln$)? What implications would this have for using entropy as a measure of uncertainty, and for using estimators of entropy?

Is there a workaround, and in which applications would someone want to suppress this behavior?

$$H(X) = -\sum_i p_i\ln(p_i)$$
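For concreteness, here is a minimal sketch (in Python, my own choice; the question does not specify a language) of the plug-in estimator that applies the formula above to observed relative frequencies. Note that since the observed $p_i$ sum to 1, the estimate is bounded above by the natural log of the number of distinct observed values, so with a fixed underlying distribution it levels off as more observations arrive rather than growing without bound:

```python
import math
import random
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy in nats:
    H = -sum_i p_i * ln(p_i), where p_i are observed relative frequencies."""
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

random.seed(0)
# Draw from a fixed uniform distribution over 8 symbols; the true entropy
# is ln(8) ≈ 2.079 nats, an upper bound the estimate cannot exceed here.
for n in (10, 100, 1000, 10000):
    samples = [random.randrange(8) for _ in range(n)]
    print(n, round(plugin_entropy(samples), 3))
```

Running this for increasing sample sizes makes it easy to check empirically how the estimate behaves as the number of observations grows.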