I know that certain probability distributions may be derived from the requirement that entropy be maximal along with a constraint such as fixed variance. In the case of fixed variance, for example, one finds the normal distribution. In particular, the maximisation is over the set of all (!) continuous PDFs with that fixed variance.
Now my question is, is there a similarly general derivation of the Poisson distribution as a maximum entropy distribution? E.g. fixing that mean and variance are equal and maximising entropy? I have found a couple of articles but they always seem to prove maximality on a restricted set of discrete PDFs. Is it because there is no more general maximum entropy principle for the Poisson distribution? If so, is it because the discrete case is simply more complex than the continuous one?
I guess the $k!$ comes from a permutation term.
Suppose a composite system. Subsystem A has $n_a$ elements and Subsystem B has $n_b$ elements, so there are ${{n_a+n_b}\choose{n_a}}$ ways to split the elements between the two subsystems. Suppose further that each element in A may be in one of $m_a$ states, and each element in B in one of $m_b$ states; you may think of each subsystem as a bag with a given number of pockets. Then, for a given partition of the elements, there are $$ {{n_a+n_b}\choose{n_a}} m_a^{n_a} m_b^{n_b} $$ states of the composite system. Assuming all states are equally probable, the probability of a partition with $n_a$ elements in A is proportional to this count. Since, by the binomial theorem, these counts sum to $(m_a+m_b)^{n_a+n_b}$ over all partitions, the probability of a given partition is
$$ P(n_a)={{n_a+n_b}\choose{n_a}}\,\frac{m_a^{n_a}\, m_b^{n_b}}{(m_a+m_b)^{n_a+n_b}}. $$
Letting $n_b$ and $m_b$ grow to infinity while $(n_a+n_b)\,m_a/m_b=\mu$ remains constant yields the Poisson distribution: $$ P(n_a)=e^{-\mu}\mu^{n_a}/n_a! $$ The $k!$ in the Poisson pmf is thus the permutation factor left over from the binomial coefficient.
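This limit is easy to check numerically. Below is a quick sketch in Python (the function names and the log-space computation are my own, used to avoid overflow for large $n_b$): hold $(n_a+n_b)\,m_a/m_b=\mu$ fixed while growing $n_b$ and $m_b$, and compare the exact partition probability with the Poisson pmf $e^{-\mu}\mu^{n_a}/n_a!$.

```python
from math import exp, lgamma, log

def log_p_partition(na, nb, ma, mb):
    """Log of the exact partition probability
    C(na+nb, na) * ma**na * mb**nb / (ma+mb)**(na+nb),
    computed in log space so large nb does not overflow."""
    n = na + nb
    log_binom = lgamma(n + 1) - lgamma(na + 1) - lgamma(nb + 1)
    return log_binom + na * log(ma) + nb * log(mb) - n * log(ma + mb)

def log_poisson(k, mu):
    """Log of the Poisson pmf exp(-mu) * mu**k / k!."""
    return -mu + k * log(mu) - lgamma(k + 1)

# Take the limit numerically: grow n_b (and hence m_b) while keeping
# (n_a + n_b) * m_a / m_b equal to mu.
ma, mu, na = 1, 2.0, 3
for nb in (10**2, 10**4, 10**6):
    mb = (na + nb) * ma / mu  # enforces the constraint
    print(nb, exp(log_p_partition(na, nb, ma, mb)), exp(log_poisson(na, mu)))
```

The two printed probabilities agree to more and more digits as $n_b$ grows, consistent with the Poisson limit of the binomial.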