How to compare two variables whose differential entropy are both negative?


I know that a higher positive entropy value indicates greater uncertainty, but I am not sure how this works when comparing two negative values.

  1. If two continuous random variables $X$ and $Y$ have differential entropies of $-1.3$ and $-0.6$ respectively, which one has more uncertainty/disorder?
  2. Also, the differential entropy of the uniform distribution $U(0,\frac{1}{2})$ is well known to be $$-\log(2) \approx -0.693,$$ and for the Gaussian $N(0,1)$ it is $$\frac{1}{2}\log(2\pi e \sigma^2) \approx 1.42$$ with $\sigma = 1$ (natural logarithms throughout). What can be said about the previous two values compared to the uniform and Gaussian entropies?

(Differential entropy here should not be confused with discrete Shannon entropy, which is always non-negative.) The differential entropy article does not explain how to compare two negative values.
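As a quick numeric sanity check, here is a minimal Python sketch of the two closed-form expressions above (entropies in nats; the function names `h_uniform` and `h_gaussian` are my own, not from any library):

```python
import math

def h_uniform(a, b):
    """Differential entropy (in nats) of U(a, b): log(b - a)."""
    return math.log(b - a)

def h_gaussian(sigma):
    """Differential entropy (in nats) of N(mu, sigma^2): (1/2) log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

print(h_uniform(0, 0.5))  # log(1/2) = -0.6931...  (negative!)
print(h_gaussian(1))      # (1/2) log(2*pi*e) = 1.4189...
```

Note that the uniform value is negative simply because the support has length less than 1, not because the distribution is somehow "more certain than certain."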

Best answer:

Chapter 8 of Cover and Thomas' book treats differential entropy. One of the main points is the following:

While the differential entropy $h(X)$ can be negative, the quantity $2^{h(X)}$, which can be interpreted as the volume of the effective support set, is always positive.

In parallel with the discrete theory there is an AEP: by the weak law of large numbers, the sample entropy converges to $h(X)$. In particular, for an i.i.d. vector $X=(X_1,\ldots,X_n)$ we have $$ \mathrm{Vol}(A_{\varepsilon}^{(n)})\leq 2^{n(h(X)+\varepsilon)} \quad \text{for all } n\geq 1, $$ where the typical set $A_{\varepsilon}^{(n)}$ is the subset of $\mathbb{R}^n$ on which the sample entropy per symbol differs from the source entropy by at most $\varepsilon.$
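The convergence of the sample entropy can be seen empirically. The sketch below (my own illustration, not from Cover and Thomas) draws i.i.d. standard Gaussian samples and checks that the empirical average of $-\log f(X_i)$ approaches $h(X) = \frac{1}{2}\log(2\pi e) \approx 1.4189$ nats:

```python
import math
import random

random.seed(0)
n = 100_000
sigma = 1.0

def log_pdf(x):
    """Log-density of N(0, sigma^2)."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - x**2 / (2 * sigma**2)

xs = [random.gauss(0, sigma) for _ in range(n)]
# Sample entropy: -(1/n) * sum of log f(X_i), which the AEP says tends to h(X)
sample_entropy = -sum(log_pdf(x) for x in xs) / n

print(sample_entropy)                        # close to 1.4189...
print(0.5 * math.log(2 * math.pi * math.e))  # exact differential entropy
```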

So, to answer question 1: since $h(X)=-1.3$ and $h(Y)=-0.6$, $X$ is effectively supported on a set of volume $2^{-1.3}$, which is smaller than $2^{-0.6}$, the effective support volume of $Y$. Because $t\mapsto 2^{t}$ is increasing, negative entropies compare exactly as positive ones do: $h(X)<h(Y)$ implies that $Y$ has more uncertainty than $X$. For question 2, the same ordering applies: $-1.3 < -\log(2) \approx -0.693 < -0.6 < 1.42$, so $X$ is less uncertain than $U(0,\frac{1}{2})$, $Y$ is slightly more uncertain than it, and both are far less uncertain than $N(0,1)$.
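To make the comparison concrete, here is a minimal sketch (following Cover and Thomas in using base 2 for the volume; the ordering is the same in any base, since $t \mapsto b^{t}$ is increasing for every $b > 1$):

```python
h_X, h_Y = -1.3, -0.6

vol_X = 2 ** h_X  # effective support volume of X, about 0.41
vol_Y = 2 ** h_Y  # effective support volume of Y, about 0.66

print(vol_X, vol_Y)
# vol_X < vol_Y: Y is spread over a larger effective set, so Y has more uncertainty,
# even though both entropies are negative.
```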