Are there discrete probability distributions parametrized by the entropy?


Is there a family of probability distributions for discrete random variables where I can specify the entropy $h$ of the distribution as a parameter of the distribution, and have other parameters independently specify the shape of the distribution? I imagine $h$ would be kind of like the temperature parameter of a softmax, except I want the entropy to only depend on the temperature, not on the logits.

If I compute $H(X)$ on this distribution, I should get back the $h$ I specified no matter the values of the other parameters.

If this doesn't exist, is there perhaps a family where $h$ is an upper bound on the entropy instead?


Sure. There are lots of them. Almost any distribution can be parametrized by entropy (and possibly other parameters). There is not just one way to parametrize any given distribution; there are typically many ways.

For instance, consider the Bernoulli distribution. The most common parametrization is by $p$, the probability of the value 1; under that parametrization, its entropy is $h_2(p)$, where $h_2$ is the binary entropy function. But you could instead parametrize it by $y$, the entropy, together with a single boolean value indicating whether $p<0.5$ or $p>0.5$. A Bernoulli distribution with entropy parameter $y$ then takes the value 1 with probability $h_2^{-1}(y)$ and the value 0 with probability $1-h_2^{-1}(y)$. The boolean is needed because $h_2^{-1}(y)$ has two possible values, one on each side of $0.5$, and it indicates whether to take the smaller or the larger of the two.
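A minimal sketch of this reparametrization, with the function names `h2` and `h2_inv` being my own choices: since $h_2$ has no closed-form inverse, the inverse is computed numerically by bisection on $[0, 0.5]$, where $h_2$ is strictly increasing (entropy here is in bits, so $y \in [0, 1]$):

```python
import math

def h2(p):
    """Binary entropy function in bits; h2(0) = h2(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h2_inv(y, upper=False, tol=1e-12):
    """Invert h2 by bisection on [0, 0.5], where h2 is increasing.
    upper=True returns the mirrored root 1 - p in [0.5, 1]."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h2(mid) < y:
            lo = mid
        else:
            hi = mid
    p = (lo + hi) / 2
    return 1 - p if upper else p

# A Bernoulli distribution parametrized by (entropy, branch):
y = 0.7                       # target entropy in bits (max is 1)
p = h2_inv(y)                 # the branch with p < 0.5
assert abs(h2(p) - y) < 1e-9  # computing H(X) recovers y
```

The boolean `upper` plays exactly the role of the extra parameter described above: it selects which of the two preimages of $y$ to use, while the entropy is the same on either branch.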

You can do similar transformations with other discrete distributions as well. Distributions with multiple parameters can likewise be reparametrized so that one parameter is the entropy and the remaining parameters control the shape.

As another example, the entropy of the Poisson distribution with rate (parameter) $\lambda$ is given by

$$g(\lambda) = \lambda(1-\log(\lambda)) + e^{-\lambda} \sum_{k=0}^\infty {\lambda^k \log(k!) \over k!}.$$

Therefore, we can reparametrize it and consider it a distribution with parameter $y$, the entropy. A Poisson distribution with entropy parameter $y$ is then equivalent to a Poisson distribution with rate parameter $\lambda=g^{-1}(y)$. Since $g$ is a strictly monotonically increasing function of $\lambda$, the inverse is always well-defined, and we don't need any additional parameters.

I suspect this might be an instance of an XY problem...