Entropy of $\max(0, \mathrm{uniform}(-1, 1))$


I'm trying to figure out how to deal with distributions that are mixtures of discrete and continuous.

A simple example is $\max(0, \mathrm{uniform}(-1, 1))$: draw a real number uniformly from $[-1, 1]$, and if it is less than zero, set it to zero.

I'd like some way of measuring the information or entropy of these distributions, but so far the obvious things don't work.

Discretizing and taking the limit of infinitesimal slices yields infinite entropy (since you're able to distinguish an infinite number of separate events).
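This divergence is easy to check numerically. Below is a minimal sketch (the function name and bin counts are my own choices): the atom at $0$ is kept as one outcome with probability $1/2$, and $(0, 1)$ is split into $n$ equal bins of probability $1/(2n)$ each; the Shannon entropy of this discretization grows like $\tfrac12 \log_2 n$, without bound.

```python
import math

def discretized_entropy(n_bins):
    # Outcomes: the atom at 0 (probability 1/2) plus n_bins equal
    # slices of (0, 1), each with probability 1/(2 * n_bins).
    p_atom = 0.5
    p_bin = 0.5 / n_bins
    h = -p_atom * math.log2(p_atom)          # contribution of the atom
    h += -n_bins * p_bin * math.log2(p_bin)  # contribution of the slices
    return h

for n in (10, 100, 1000, 10_000):
    print(n, discretized_entropy(n))
```

In closed form this is $1 + \tfrac12 \log_2 n$, so refining the grid adds half a bit per doubling of the bin count.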

Differential entropy and the limiting density of discrete points don't work because the distribution doesn't have a proper probability density function (there is no finite relative probability between getting $0$ and getting $0.5$ -- the ratio is infinite).

Any pointers on where/how to get started would be appreciated. Maybe this sort of thing hasn't been done before? (I'd bet against that, though.)


Best answer:

Your density is $f_X(x) = \frac12 \delta(x) + \frac12 \mathbb{1}_{0<x<1}$, where $\delta$ is the Dirac delta.

As you mentioned, the true (Shannon) entropy is infinite. The differential entropy can be computed by several arguments. For simplicity, let's take $\delta(x)$ as the limit of a uniform density on $[-a,0]$ as $a \to 0^+$. Then

$$\begin{align} -\int f_X(x)\log f_X(x)\,dx &=\int_{-a}^0 \frac{1}{2a}\log (2a)\,dx+\int_{0}^1 \frac{1}{2}\log ({2})\,dx \\ &= \frac{1}{2}\log ({2a}) + \frac{1}{2}\log ({2}) \end{align} $$

Taking the limit as $a \to 0^+$, we conclude that the differential entropy tends to $-\infty$, just as for a purely discrete distribution.
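The closed form above can be evaluated directly; a minimal sketch (function name is my own) that prints the smoothed differential entropy $\tfrac12\log(2a) + \tfrac12\log 2$ for shrinking $a$:

```python
import math

def diff_entropy(a):
    # Differential entropy (in nats) of the smoothed density:
    # 1/(2a) on [-a, 0] plus 1/2 on (0, 1).
    return 0.5 * math.log(2 * a) + 0.5 * math.log(2)

for a in (1e-1, 1e-3, 1e-6, 1e-9):
    print(a, diff_entropy(a))
```

The values decrease like $\tfrac12 \log a$, confirming the divergence to $-\infty$.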

Hence, one might conclude that any distribution with a "discrete component" (more precisely: $P(X=b)=\epsilon > 0$ for some $b$) has differential entropy $-\infty$. Caveat 1: the "continuous component" could still have differential entropy $+\infty$, in which case computing the entropy becomes an interesting problem. Caveat 2: there are also purely continuous variables with differential entropy $-\infty$.
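For Caveat 2, a classical example (not from the answer above, but standard) is the density

$$f(x) = \frac{1}{x \log^2 x}, \qquad 0 < x < e^{-1},$$

which integrates to $1$ (substitute $u = -\log x$), yet

$$-\int_0^{1/e} f(x)\log f(x)\,dx = \int_0^{1/e} \frac{dx}{x\log x} + 2\int_1^{\infty} \frac{\log u}{u^2}\,du = -\infty + 2 = -\infty,$$

since the first integral, with antiderivative $\log\lvert\log x\rvert$, diverges to $-\infty$, while the second (again via $u = -\log x$) is finite.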