Differential entropy cannot be negative for a pdf on (0,1)


Differential entropy can be negative for some densities, but when the support is an interval of length $1$, I suspect it cannot be.

Claim. For any probability density function $f$ with domain $(0,1)$,

$$-\int_0^1 f(x) \log f(x)\ dx \geq 0.$$

Is this claim true?

Note. In the original post I omitted a minus sign everywhere, which made the question misleading. I did not edit that question because doing so would have invalidated the accepted answer. Instead, following this meta recommendation, I wrote this clean version for better reference.

On BEST ANSWER

Nope. The uniform distribution $U(0,a)$ for any $0<a<1$ has differential entropy $\log(a)$, which is negative.
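As a quick sanity check (a sketch, not part of the original answer), a midpoint-rule approximation of $-\int f \log f$ for the uniform density recovers $\log(a)$, which is indeed negative for $a < 1$:

```python
import math

def uniform_entropy(a, n=10_000):
    """Midpoint-rule approximation of -∫_0^a f(x) log f(x) dx
    for the U(0, a) density f(x) = 1/a; the exact value is log(a)."""
    dx = a / n
    dens = 1.0 / a  # constant density of U(0, a)
    return -sum(dens * math.log(dens) * dx for _ in range(n))

print(uniform_entropy(0.5))  # ≈ log(0.5) ≈ -0.693, negative
```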

If you want something that is supported on exactly $(0,1)$, you can consider the average of two independent uniforms on $(0,1)$, which follows a triangular distribution: $$f(x) = \begin{cases} 4x & x \in (0, 1/2] \\ 4(1-x) & x \in (1/2, 1).\end{cases}$$ The differential entropy (using the natural logarithm) is $1/2 + \log(1/2) = 1/2 - \log 2 \approx -0.19$.
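The entropy of the triangular density can likewise be verified numerically (a sketch using a simple midpoint rule; the exact value, by integrating $-4x\log(4x)$ over each half and using symmetry, is $1/2 - \log 2$):

```python
import math

def triangular_density(x):
    """Density of the average of two independent U(0,1) variables."""
    return 4 * x if x <= 0.5 else 4 * (1 - x)

def triangular_entropy(n=1_000_000):
    """Midpoint-rule approximation of -∫_0^1 f(x) log f(x) dx."""
    dx = 1.0 / n
    return -sum(
        triangular_density(x) * math.log(triangular_density(x)) * dx
        for x in (dx * (i + 0.5) for i in range(n))
    )

print(triangular_entropy(), 0.5 - math.log(2))  # both ≈ -0.193
```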