Given a discrete random variable $X$ with $E[X] = 5000$, I want to construct a probability distribution for $X$ using the principle of maximum entropy. $X$ takes values between $0$ and some unknown finite upper bound. I once observed $X$ take the value $15{,}000$, but our background information on the underlying process gives no good reason to believe $15{,}000$ is the upper bound.
Jaynes (2003) states that the sample space must be known before MaxEnt methods can be applied, so is there any reasonable way to estimate the upper bound? There is no other information available to work with.
If we know the maximum allowed value, say $N$, then the maximum entropy distribution is a truncated (discrete) exponential: $p(x) = b_N a_N^{-x}$, where $a_N, b_N$ are determined by the two constraints $\sum_{x=0}^N p(x) = 1$ and $\sum_{x=0}^N x\, p(x) = \mu$.
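As a numerical sketch (my own addition, not part of the answer), the two constraints can be solved by writing $p(x) \propto r^x$ with $r = 1/a_N$ and bisecting on $r$, since the mean of this family increases monotonically from $0$ to $N$ as $r$ grows:

```python
def truncated_maxent(N, mu):
    """Return (r, C) such that p(x) = C * r**x on {0, ..., N} has mean mu,
    with 0 < mu < N. This is the b_N * a_N**(-x) form with r = 1/a_N."""
    def mean(r):
        w = [r**x for x in range(N + 1)]
        return sum(x * wx for x, wx in enumerate(w)) / sum(w)

    lo, hi = 1e-12, 1.0
    while mean(hi) < mu:          # widen the bracket when mu > N/2 (needs r > 1)
        hi *= 2.0
    for _ in range(200):          # bisection on the ratio r
        mid = (lo + hi) / 2.0
        if mean(mid) < mu:
            lo = mid
        else:
            hi = mid
    r = (lo + hi) / 2.0
    C = 1.0 / sum(r**x for x in range(N + 1))
    return r, C
```

For example, `truncated_maxent(15000, 5000.0)` recovers the distribution for the observed bound; a dedicated root-finder (e.g. `scipy.optimize.brentq`) would do the same job faster.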
Now, if we take some $N_2 > N_1$, the resulting maximum entropies satisfy $H_2 > H_1$: every distribution supported on $\{0, \dots, N_1\}$ is also a distribution on $\{0, \dots, N_2\}$ satisfying the same mean constraint, so enlarging $N$ enlarges the feasible set, and the maxent distribution for $N_2$ puts positive mass on the new points. Hence the entropy of the maxent distribution increases with $N$.
Hence, if we consider the set of distributions corresponding to all finite $N$, there is no maximum entropy distribution. There is a supremum for the entropy, which corresponds to $N = \infty$: this is the non-truncated discrete exponential, i.e. a geometric distribution. None of the finite-support distributions achieves it, but we can get arbitrarily close by choosing larger and larger $N$.
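This limit can be checked numerically. The sketch below (my own illustration, using a small mean $\mu = 5$ rather than $5000$ to keep it fast) computes the maxent entropy for growing $N$ and compares it with the entropy of the limiting geometric distribution $p(x) = (1-q)q^x$, whose mean $q/(1-q)$ equals $\mu$ when $q = \mu/(1+\mu)$:

```python
import math

def maxent_entropy(N, mu):
    """Entropy (nats) of the maxent p(x) ∝ r^x on {0, ..., N} with mean mu < N/2."""
    lo, hi = 1e-12, 1.0
    for _ in range(200):              # bisection on r; the mean increases with r
        r = (lo + hi) / 2.0
        w = [r**x for x in range(N + 1)]
        m = sum(x * wx for x, wx in enumerate(w)) / sum(w)
        if m < mu:
            lo = r
        else:
            hi = r
    r = (lo + hi) / 2.0
    w = [r**x for x in range(N + 1)]
    Z = sum(w)
    return -sum((wx / Z) * math.log(wx / Z) for wx in w)

mu = 5.0
for N in (20, 50, 100, 200):
    print(N, round(maxent_entropy(N, mu), 6))

# Entropy of the limiting geometric distribution p(x) = (1 - q) q^x:
q = mu / (1.0 + mu)
print("limit:", round(-math.log(1.0 - q) - mu * math.log(q), 6))
```

The printed entropies increase with $N$ and stay below the geometric limit, consistent with the argument above.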