Given Bell numbers as moments, derive the Poisson distribution


The Poisson distribution (with $\lambda=1$) has probability mass function $\frac{e^{-1}}{k!}$, where $k\in\{0,1,2,\dots\}$. Its $n$-th moment is the Bell number $B_n$, which counts the partitions of a set with $n$ elements.

If I start with the Bell numbers and I suppose that they are the moments of some distribution, then how can I arrive at the Poisson distribution?

Let me give a similar problem that I am able to solve. Given moments $\mu_n=n!$, I can use the formula $\hat{w}(\xi)=\sum_{n=0}^{\infty}\frac{(-2\pi i)^n \mu_n}{n!}\xi^n$. This yields

$$\hat{w}(\xi)=\sum_{n=0}^{\infty}\frac{(-2\pi i)^n n!}{n!}\xi^n=\sum_{n=0}^{\infty}(-2\pi i\xi)^n=\frac{1}{1-(-2\pi i \xi)}=\frac{1}{1+2\pi i\xi}$$

Then I can apply $w(x)=\int_{-\infty}^{\infty}\hat{w}(\xi)e^{2\pi i \xi x}d\xi$. The integral is solved here. The result is $w(x)=e^{-x}H(x)$ where $H(x)$ is the Heaviside function.
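As a sanity check on this worked example, here is a minimal numeric sketch (plain Python; the integration cutoff and step count are ad hoc choices of mine, not part of the derivation) confirming that the moments of $w(x)=e^{-x}H(x)$ are indeed $\mu_n=n!$:

```python
import math

def moment(n, upper=60.0, steps=200_000):
    """n-th moment of w(x) = exp(-x) on x >= 0, by the trapezoid rule.

    [0, upper] captures essentially all the mass for small n, since
    x**n * exp(-x) decays rapidly.
    """
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        weight = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoint weights
        total += weight * (x ** n) * math.exp(-x)
    return total * h

for n in range(6):
    print(n, moment(n), math.factorial(n))
```

The printed pairs agree to within the quadrature error, matching $\int_0^\infty x^n e^{-x}\,dx = n!$.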

I want to do something similar to go from the Bell numbers to the Poisson distribution. In my example above, I make use of the Fourier transform. But I don't know what would work in this case.

2 Answers

BEST ANSWER

I will try to answer my own question. I found this page helpful: What is the Laplace Transform of the Poisson Distribution?

I know that the generating function of the Bell numbers is $e^{e^t-1}$, or more generally, if each part of a partition contributes weight $\lambda$, it is $e^{\lambda(e^t-1)}$. This is the moment generating function $M_X(t)$, and as explained at the Wikipedia page, it is related to the Laplace transform of the probability density function $f_X(x)$ as follows: $M_X(t)=\mathcal{L}\{f_x\}(-t)$. So we could take the inverse Laplace transform. But we need a version that works with discrete functions. That is the Z-transform where $z$ is a complex number but we can write $z=Ae^{it}$ so it is like the Laplace transform. We can proceed:

$$e^{\lambda(e^t-1)}=e^{-\lambda}e^{\lambda e^t}=e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^t)^x}{x!}= e^{-\lambda}\sum_{x=0}^{\infty}e^{tx}\frac{\lambda^x}{x!} = \sum_{x=0}^{\infty}e^{tx}\,e^{-\lambda}\frac{\lambda^x}{x!} = \sum_{x=0}^{\infty}e^{tx}f_X(x)$$

where $f_X(x)=e^{-\lambda}\frac{\lambda^x}{x!}$ and $x\in\{0,1,2,\dots\}$.
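The identity above can be checked numerically. Here is a minimal sketch (plain Python; the Bell-triangle recurrence and the series cutoff are my own scaffolding, not part of the derivation) comparing the Bell numbers with the raw moments of a Poisson($1$) variable:

```python
import math

def bell_numbers(n_max):
    """B_0 .. B_{n_max} via the Bell triangle recurrence."""
    row, bells = [1], [1]
    for _ in range(n_max):
        new = [row[-1]]          # each row starts with the previous row's last entry
        for v in row:
            new.append(new[-1] + v)
        row = new
        bells.append(row[0])
    return bells

def poisson_moment(n, lam=1.0, cutoff=100):
    """Truncated series for E[X^n] with X ~ Poisson(lam)."""
    return sum(x ** n * math.exp(-lam) * lam ** x / math.factorial(x)
               for x in range(cutoff))

print(bell_numbers(6))                            # [1, 1, 2, 5, 15, 52, 203]
print([round(poisson_moment(n), 6) for n in range(7)])
```

The two lists agree, which is Dobinski's formula in disguise: $B_n = e^{-1}\sum_{x\ge 0} x^n/x!$.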

So that essentially gives the answer I want, but I don't understand in what sense the transform is sensible, valid, meaningful, or correct. Comments are appreciated!

Edit: Here Robert Murray helped me understand what I wanted to say.

$\sum_{x=0}^{\infty}e^{tx}f_X(x) = \sum_{x=0}^{\infty}e^{tx}p_x = \mathbb{E}(e^{Xt})$

where $X$ is a discrete random variable taking values $x\in\{0,1,2,\dots\}$, with probability mass function $f_X(x)=p_x=e^{-\lambda}\frac{\lambda^x}{x!}$. So we started with the generating function of the Bell numbers, and we have arrived at an explicit form of the Poisson distribution expressing the expected value of $e^{Xt}$, which encodes all of the moments.

What have we achieved? We know that the Bell polynomial $B_t(\lambda)$ counts the partitions of a set with $t$ elements, where each part is given weight $\lambda$. So now we can try to think of the Poisson distribution from this combinatorial perspective. The Poisson distribution is the weight function for the orthogonality of the Charlier polynomials, whose coefficients also have a combinatorial interpretation. Similarly, we can do so for the other orthogonal Sheffer polynomials, which are important in solving Schrödinger's equation. So this gives a new way of investigating quantum physics, as I do here.
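To make the Charlier claim concrete, here is a small numeric sketch (my own code, using one common normalization, $C_n(x;\mu)=\sum_k \binom{n}{k}\binom{x}{k}k!\,(-1/\mu)^k$, under which the orthogonality relation reads $\sum_x \frac{\mu^x}{x!}C_m C_n = \mu^{-n}e^{\mu}n!\,\delta_{mn}$; conventions differ across references):

```python
import math

def charlier(n, x, mu):
    """Charlier polynomial C_n(x; mu) in one common normalization."""
    return sum(math.comb(n, k) * math.comb(x, k) * math.factorial(k)
               * (-1.0 / mu) ** k for k in range(n + 1))

def inner(m, n, mu=1.0, cutoff=80):
    """<C_m, C_n> against the Poisson-type weight mu^x / x! (truncated)."""
    return sum(mu ** x / math.factorial(x)
               * charlier(m, x, mu) * charlier(n, x, mu)
               for x in range(cutoff))

# Off-diagonal entries vanish; diagonal entries equal mu^{-n} e^{mu} n!.
print(inner(1, 2), inner(2, 2), math.exp(1) * 2)
```

With $\mu=1$ the weight is exactly the (unnormalized) Poisson($1$) mass function, which is the sense in which the Poisson distribution underlies the Charlier orthogonality.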

Robert, thank you! I needed your help! I learned a lot!

SECOND ANSWER

The exponential generating function for the Bell numbers is: $$B(x) = e^{e^{x}-1}$$ You may recognize this as the moment generating function for the Poisson distribution when $\lambda = 1$. Let me know if you want more hints.

Edit: You have severely overcomplicated this. The moment generating function of a random variable $X$ is defined as $$M_X(t) = \mathbb{E}[e^{tX}],$$ provided there exists a $\delta>0$ such that the expectation is finite for all $t \in (-\delta,\delta)$. We only care about the values of the $n$th derivative with respect to $t$ at $t = 0$, which gives the $n$th moment: $$\mathbb{E}[X^n] = \frac{d^n}{dt^n} M_X(t)\Big|_{t=0}$$ We also know that an exponential generating function yields the $n$th term of its sequence by taking the $n$th derivative and setting $x=0$: $$B_n = \frac{d^n}{dx^n} B(x)\Big|_{x=0}$$ Hence the moment generating function of our random variable $X$ is $$M_X(t) = e^{e^{t}-1},$$ because the $n$th derivative at $t=0$ gives both the $n$th moment and the $n$th Bell number. And if two random variables $X$ and $Y \sim \text{Poi}(1)$ have the same moment generating function, they have the same distribution.
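As a quick check of this last step, a minimal sketch (plain Python; the series cutoff is an arbitrary choice of mine) compares the truncated series $\sum_x e^{tx}\,e^{-1}/x!$ against the closed form $e^{e^t-1}$:

```python
import math

def poisson1_mgf(t, cutoff=100):
    """E[e^{tX}] for X ~ Poisson(1), by summing the truncated series."""
    term = math.exp(-1.0)          # x = 0 term: e^{-1} * e^{0*t} / 0!
    total = term
    for x in range(1, cutoff):
        term *= math.exp(t) / x    # ratio of consecutive terms is e^t / x
        total += term
    return total

for t in (-1.0, 0.0, 0.5, 1.0):
    print(t, poisson1_mgf(t), math.exp(math.exp(t) - 1.0))
```

The two columns agree to floating-point precision, as the uniqueness argument above requires.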