Finding distribution to maximize entropy of a random variable subject to fixed mean


I saw this as a past paper question:

$X$ is a random variable taking values in the non-negative integers, $\mathbb Z^{\geq 0}$, and has fixed mean $\mathbb E(X) = m$. Find the distribution of $X$ when its entropy $H(X)$ is maximized.

The question suggests using Gibbs' inequality as a hint; however, I cannot find a way to apply it appropriately.

Any help is greatly appreciated!


Thanks to the hint from @stochasticboy321.

By Gibbs' inequality, we have:

$H(X) = -\sum_i p_i \log p_i \leq -\sum_i p_i \log q_i$ for any probability mass function $(q_i)$ on $\mathbb Z^{\geq 0}$, with equality iff $p = q$.

We also have the constraint $\mathbb E(X) = \sum_i i\,p_i = m$ from the question statement. A natural choice of distribution on $\mathbb Z^{\geq 0}$ that takes advantage of the known mean $\mathbb E(X)$ is the geometric distribution.

So suppose $q_i = x(1-x)^i$ for some $x \in (0,1)$; then $\log q_i = \log x + i\log(1-x)$. Substituting into the bound and using $\sum_i p_i = 1$ and $\sum_i i\,p_i = m$:

$H(X) \leq -\sum_i p_i\left(\log x + i\log(1-x)\right) = -\log x - m\log(1-x)$

For the tightest bound on $H(X)$ we minimize the right-hand side over $x$: setting the derivative $-1/x + m/(1-x)$ to zero gives $x = 1/(m+1)$. With this parameter, the geometric distribution $q$ has mean $(1-x)/x = m$, which is exactly the mean constraint, so taking $p = q$ attains equality in Gibbs' inequality. Hence $H(X)$ is maximized when $X \sim p = \mathrm{Geom}(1/(m+1))$ on $\mathbb Z^{\geq 0}$.
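As a quick numerical sanity check (a sketch of my own, not part of the derivation), the snippet below takes an example value $m = 3$, computes the entropy of the geometric distribution with parameter $x = 1/(m+1)$, and confirms it matches the bound $-\log x - m\log(1-x)$. For comparison, it also computes the entropy of a Poisson($m$) distribution, another pmf on $\mathbb Z^{\geq 0}$ with mean $m$, which should come out strictly smaller:

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a pmf given as a list of probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

m = 3.0              # example fixed mean E(X) = m
x = 1.0 / (m + 1.0)  # the optimal parameter from the derivation
N = 2000             # truncation point; the neglected tail mass is negligible

# Geometric pmf on {0, 1, 2, ...}: q_i = x * (1 - x)^i, with mean (1 - x)/x = m.
geom = [x * (1.0 - x) ** i for i in range(N)]

# Poisson(m): another pmf on {0, 1, 2, ...} with mean m, for comparison.
pois = [math.exp(-m) * m ** i / math.factorial(i) for i in range(60)]

# Right-hand side of the inequality above.
bound = -math.log(x) - m * math.log(1.0 - x)

print(entropy(geom), bound)  # these two agree (up to truncation error)
print(entropy(pois))         # strictly smaller than the bound
```

The agreement between `entropy(geom)` and `bound` reflects the equality case $p = q$ in Gibbs' inequality, while any other mean-$m$ distribution, such as the Poisson, falls strictly below it.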