Which continuous probability density function, given both a fixed standard deviation and a finite support, maximizes entropy?


It is fairly well known that, according to the definition of entropy, $$ H=-\int p(x) \log (p(x)) \, dx, $$ the Gaussian (normal) distribution maximizes $H$ among all densities with a given standard deviation. Also, among continuous p.d.f.s supported on a finite interval, the "flat" or uniform distribution gives the greatest entropy. What happens if we impose both restrictions, i.e. the p.d.f. must have its support in a given interval and its standard deviation is fixed? Which function maximizes $H$ under these constraints? Moreover, does the answer converge to the Gaussian as the standard deviation becomes small in comparison to the support, and to the flat distribution as the standard deviation approaches the half-width of the support?
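The entropy functional in the question can be evaluated numerically for any candidate density. As a sanity check (a minimal sketch using `scipy`; the function name `differential_entropy` is my own), the uniform density on $[-1,1]$ should give $H=\log 2$:

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy(p, a, b):
    """H = -integral of p(x) * log(p(x)) over [a, b], by numerical quadrature."""
    return -quad(lambda x: p(x) * np.log(p(x)), a, b)[0]

# Uniform density on [-1, 1] is p(x) = 1/2, so H = log(2) ~ 0.6931.
H_uniform = differential_entropy(lambda x: 0.5, -1.0, 1.0)
```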




Assuming, WLOG, that the support is $[-1,1]$ (and taking the mean to be $0$ by symmetry), Lagrange multipliers give $p(x) = \alpha e^{\beta x^2}$, where $\alpha, \beta$ are determined by the two restrictions $\int_{-1}^1 p(x)\,dx=1$ and $\int_{-1}^1 x^2 p(x)\,dx=\sigma^2$.

Notice, however, that the variance cannot be arbitrary (in this case, $0 \le \sigma^2 \le 1$).

The maximizing density then corresponds to a truncated Gaussian ($\beta < 0$) if $\sigma^2 < \sigma^2_0$, the uniform distribution ($\beta = 0$) if $\sigma^2 = \sigma^2_0$, and an inverse truncated Gaussian ($\beta > 0$) if $\sigma^2_0 < \sigma^2 < 1$, where $\sigma^2_0 = 1/3$ is the variance of the uniform distribution on $[-1,1]$.
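The multipliers $\alpha, \beta$ have no simple closed form, but they are easy to obtain numerically: pick $\beta$ so that the variance of $\alpha e^{\beta x^2}$ matches $\sigma^2$, then set $\alpha$ from normalization. A minimal sketch, assuming `scipy` and a hypothetical helper name `solve_max_entropy`:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def solve_max_entropy(sigma2):
    """Find (alpha, beta) so that p(x) = alpha * exp(beta * x**2) on [-1, 1]
    is normalized and has variance sigma2 (mean 0 by symmetry)."""
    Z  = lambda b: quad(lambda x: np.exp(b * x**2), -1.0, 1.0)[0]
    m2 = lambda b: quad(lambda x: x**2 * np.exp(b * x**2), -1.0, 1.0)[0]
    # Variance mismatch as a function of beta; it is increasing in beta,
    # so a simple bracketing root-finder suffices.
    beta = brentq(lambda b: m2(b) / Z(b) - sigma2, -200.0, 200.0)
    alpha = 1.0 / Z(beta)
    return alpha, beta

# sigma^2 < 1/3 lands in the truncated-Gaussian regime (beta < 0).
alpha, beta = solve_max_entropy(0.1)
```

For $\sigma^2 = 1/3$ the solver returns $\beta \approx 0$, recovering the uniform distribution, consistent with the three regimes above.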