Are there maximum entropy distributions with fixed moments of a given order?


A characterization of the multivariate Gaussian distribution with a fixed mean and covariance matrix is that it is the unique probability distribution with fixed mean and covariance matrix that maximizes differential entropy. That is, the Gaussian PDF $f$ is the unique solution to the following maximization problem over probability densities:

$$\underset{f}{\text{argmax}} \{-\mathbb{E}_f[\log f(x)] : \mathbb{E}_f[x_i] = \mu_i,\; \mathbb{E}_f[x_i x_j] = \sigma_{ij}\}.$$
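As a quick numerical sanity check of this characterization (my own illustration, not part of the original question), one can compare the closed-form differential entropies of a few standard distributions sharing mean $0$ and variance $1$; the Gaussian comes out largest:

```python
# Sanity check: among densities with the same mean and variance, the Gaussian
# has the largest differential entropy. Compare closed-form entropies at
# mean 0, variance 1 for the Gaussian, uniform, and Laplace distributions.
import math

var = 1.0

# Gaussian N(0, var): h = (1/2) * log(2*pi*e*var)
h_gauss = 0.5 * math.log(2 * math.pi * math.e * var)

# Uniform on [-w/2, w/2]: variance w^2/12 = var  =>  w = sqrt(12*var); h = log(w)
h_unif = math.log(math.sqrt(12 * var))

# Laplace with scale b: variance 2*b^2 = var  =>  b = sqrt(var/2); h = 1 + log(2*b)
h_lap = 1 + math.log(2 * math.sqrt(var / 2))

print(h_gauss, h_lap, h_unif)  # Gaussian entropy is the largest of the three
assert h_gauss > h_lap > h_unif
```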

When is there a solution to this problem if we specify the moments of degree at most $k$? That is, when is there a solution to the problem

$$ \underset{f}{\text{argmax}} \{-\mathbb{E}_f[\log f(x)] : \forall \alpha,\; \mathbb{E}_f[x^{\alpha}] = \mu_{\alpha}\}. $$

Here, $\alpha = (\alpha_1, \dots, \alpha_n)$ ranges over those values where $\sum_{i=1}^n \alpha_i \le k$.

For example, for a solution to exist it is obviously necessary that some probability distribution with the given moments exists (I've heard the question of whether such a distribution exists referred to as a moment problem in some contexts). Is this sufficient? Has this particular family of probability distributions appeared in the literature before?

1 Answer

In the scalar case, the solution should have the form

$$f(x)= e^{- p(x)}$$ where $p(x)$ is a polynomial of degree $k$. Its $k+1$ coefficients should be determined implicitly by the $k$ moment conditions together with the normalization condition.
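A sketch of how those coefficients can be found numerically (my own illustration, not from the answer): solve for $c_0, c_1, c_2$ so that $f(x) = e^{-(c_0 + c_1 x + c_2 x^2)}$ matches $\mathbb{E}[x]=0$, $\mathbb{E}[x^2]=1$, plus normalization. For $k=2$ this should recover the standard Gaussian, i.e. $c_1 = 0$, $c_2 = 1/2$, $c_0 = \tfrac{1}{2}\log(2\pi)$.

```python
# Fit the coefficients of p(x) = c0 + c1*x + c2*x^2 so that f = exp(-p)
# has unit mass, zero mean, and unit second moment. The grid width and
# resolution below are assumptions chosen to make the tails negligible.
import numpy as np
from scipy.optimize import fsolve

x = np.linspace(-10, 10, 4001)  # quadrature grid
dx = x[1] - x[0]

def residuals(c):
    f = np.exp(-(c[0] + c[1] * x + c[2] * x**2))
    return [
        np.sum(f) * dx - 1.0,         # normalization
        np.sum(x * f) * dx - 0.0,     # first moment
        np.sum(x**2 * f) * dx - 1.0,  # second moment
    ]

c = fsolve(residuals, x0=[1.0, 0.0, 0.5])
print(c)  # should be close to [0.5*log(2*pi), 0, 0.5], the Gaussian case
```

The same scheme extends to higher even $k$ by adding one grid moment equation and one polynomial coefficient per extra degree.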

This suggests that the problem has no solution on $\mathbb{R}$ when $k$ is odd, since $e^{-p(x)}$ is then not integrable (as is already the case for $k=1$).
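The $k=1$ case can be checked directly: with $p(x) = c_0 + c_1 x$,

$$\int_{-\infty}^{\infty} e^{-c_0 - c_1 x}\,dx = e^{-c_0}\int_{-\infty}^{\infty} e^{-c_1 x}\,dx = \infty$$

for every choice of $c_1$ (the integrand blows up at one end of the line whenever $c_1 \neq 0$, and is constant when $c_1 = 0$), so no normalizable density of this form exists.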