The Poisson distribution (with $\lambda=1$) has probability mass function $\frac{e^{-1}}{k!}$ where $k\in\{0,1,2,\dots\}$. Its moments are the Bell numbers $B_n$, which count the partitions of a set with $n$ elements.
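This moment identity can be checked numerically via Dobiński's formula, $B_n=e^{-1}\sum_{k\ge 0}k^n/k!$, whose right-hand side is exactly the $n$-th moment of a Poisson(1) variable (a quick sketch; the truncation point `terms=200` is an arbitrary choice):

```python
from math import exp, factorial

def poisson1_moment(n, terms=200):
    """n-th moment of Poisson(lambda=1): E[X^n] = e^{-1} * sum_k k^n / k!."""
    return exp(-1) * sum(k**n / factorial(k) for k in range(terms))

# First few Bell numbers B_0..B_5 for comparison.
bell = [1, 1, 2, 5, 15, 52]

for n, b in enumerate(bell):
    m = poisson1_moment(n)
    assert abs(m - b) < 1e-9, (n, m, b)
    print(f"E[X^{n}] = {m:.6f}  (B_{n} = {b})")
```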
If I start with the Bell numbers and I suppose that they are the moments of some distribution, then how can I arrive at the Poisson distribution?
Let me give a similar problem that I am able to solve. Given moments $\mu_n=n!$, I can use the formula $\hat{w}(\xi)=\sum_{n=0}^{\infty}\frac{(-2\pi i)^n \mu_n}{n!}\xi^n$. This yields
$$\hat{w}(\xi)=\sum_{n=0}^{\infty}\frac{(-2\pi i)^n n!}{n!}\xi^n=\sum_{n=0}^{\infty}(-2\pi i\xi)^n=\frac{1}{1-(-2\pi i \xi)}=\frac{1}{1+2\pi i\xi}$$
Then I can apply $w(x)=\int_{-\infty}^{\infty}\hat{w}(\xi)e^{2\pi i \xi x}d\xi$. The integral is solved here. The result is $w(x)=e^{-x}H(x)$ where $H(x)$ is the Heaviside function.
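As a sanity check on this warm-up example, the moments of $w(x)=e^{-x}H(x)$ can be integrated numerically and compared with $n!$ (a sketch using the trapezoidal rule; the cutoff `upper=60` and step count are arbitrary choices):

```python
from math import exp, factorial

def exp_moment(n, upper=60.0, steps=100_000):
    """n-th moment of w(x) = e^{-x} H(x), via the trapezoidal rule on [0, upper]."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        weight = 0.5 if i in (0, steps) else 1.0  # trapezoid endpoint weights
        total += weight * x**n * exp(-x)
    return total * h

for n in range(6):
    print(f"moment {n}: {exp_moment(n):.6f} vs n! = {factorial(n)}")
```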
I want to do something similar to go from the Bell numbers to the Poisson distribution. In my example above, I make use of the Fourier transform. But I don't know what would work in this case.
I will try to answer my own question. I found this page helpful: What is the Laplace Transform of the Poisson Distribution?
I know that the exponential generating function of the Bell numbers is $e^{e^t-1}$, or more generally, if each part of a partition contributes weight $\lambda$, it is $e^{\lambda(e^t-1)}$. This is the moment generating function $M_X(t)$, and as explained at the Wikipedia page, it is related to the Laplace transform of the probability density function $f_X(x)$ as follows: $M_X(t)=\mathcal{L}\{f_X\}(-t)$. So we could take the inverse Laplace transform, but we need a version that works with discrete functions. That is the Z-transform, where $z$ is a complex number; writing $z=Ae^{it}$ shows its kinship with the Laplace transform. We can proceed:
$$e^{\lambda(e^t-1)}=e^{-\lambda}e^{\lambda e^t}=e^{-\lambda}\sum_{x=0}^{\infty}\frac{(\lambda e^t)^x}{x!}= e^{-\lambda}\sum_{x=0}^{\infty}e^{tx}\frac{\lambda^x}{x!} = \sum_{x=0}^{\infty}e^{tx}e^{-\lambda}\frac{\lambda^x}{x!} = \sum_{x=0}^{\infty}e^{tx}f_X(x)$$
where $f_X(x)=e^{-\lambda}\frac{\lambda^x}{x!}$ and $x\in\{0,1,2,\dots\}$.
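The chain of equalities above can be verified numerically: summing $e^{tx}f_X(x)$ over the Poisson pmf reproduces the closed form $e^{\lambda(e^t-1)}$ (a sketch; the truncation `terms=100` is an arbitrary choice, kept small enough that `factorial(x)` stays within float range when dividing):

```python
from math import exp, factorial

def poisson_mgf_series(t, lam, terms=100):
    """Left-hand side: sum over x of e^{tx} * e^{-lam} * lam^x / x!."""
    return sum(exp(t * x) * exp(-lam) * lam**x / factorial(x)
               for x in range(terms))

for lam in (0.5, 1.0, 2.0):
    for t in (-1.0, 0.0, 0.7):
        closed_form = exp(lam * (exp(t) - 1))
        series = poisson_mgf_series(t, lam)
        assert abs(series - closed_form) < 1e-9, (lam, t)
        print(f"lam={lam}, t={t}: series={series:.6f}, closed form={closed_form:.6f}")
```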
So basically that gives the answer I want, but I don't understand in what sense this transform is sensible, valid, meaningful, or correct. I would appreciate comments!
Edit: Here, Robert Murray led me to understand what I wanted to say.
$\sum_{x=0}^{\infty}e^{tx}f_X(x) = \sum_{x=0}^{\infty}e^{tx}p_x = \mathbb{E}(e^{Xt})$
where $X$ is a discrete random variable taking values $x\in\{0,1,2,\dots\}$, with probability mass function $f_X(x)=p_x=e^{-\lambda}\frac{\lambda^x}{x!}$. So we started with the generating function of the Bell numbers and arrived at an explicit form of the Poisson distribution, which expresses the expected value of $e^{Xt}$, a summary of all of the moments.
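The claim that the weighted Bell numbers are the moments of Poisson($\lambda$) can be checked directly against the standard closed forms $B_1(\lambda)=\lambda$, $B_2(\lambda)=\lambda^2+\lambda$, $B_3(\lambda)=\lambda^3+3\lambda^2+\lambda$ (a sketch; the truncation `terms=120` is an arbitrary choice):

```python
from math import exp, factorial

def poisson_moment(n, lam, terms=120):
    """E[X^n] for X ~ Poisson(lam), summed directly from the pmf."""
    return sum(x**n * exp(-lam) * lam**x / factorial(x) for x in range(terms))

# Compare with the closed forms of the first few Bell (Touchard) polynomials.
for lam in (0.5, 1.0, 3.0):
    assert abs(poisson_moment(0, lam) - 1) < 1e-9
    assert abs(poisson_moment(1, lam) - lam) < 1e-9
    assert abs(poisson_moment(2, lam) - (lam**2 + lam)) < 1e-9
    assert abs(poisson_moment(3, lam) - (lam**3 + 3*lam**2 + lam)) < 1e-9
print("moments of Poisson(lam) match B_n(lam) for n = 0..3")
```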
What have we achieved? We know that the Bell numbers $B_t(\lambda)$ count the partitions of a set with $t$ elements, where each part is given the weight $\lambda$. So now we can try to think of the Poisson distribution from this combinatorial perspective. The Poisson distribution is the weight function for the orthogonality of the Charlier polynomials, whose coefficients also have a combinatorial interpretation. Similarly, we can do so for other orthogonal Sheffer polynomials, which are important in solving Schrödinger's equation. So this gives a new way of investigating quantum physics, as I do here.
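The orthogonality of the Charlier polynomials with respect to the Poisson weight can also be verified numerically. Below is a sketch assuming the hypergeometric convention $C_n(x;a)={}_2F_0(-n,-x;;-1/a)=\sum_k(-1)^k\binom{n}{k}x^{\underline{k}}a^{-k}$, under which $\sum_{x\ge 0}e^{-a}\frac{a^x}{x!}C_m(x;a)C_n(x;a)=a^{-n}n!\,\delta_{mn}$ (normalization conventions vary between references):

```python
from math import exp, factorial, comb

def falling(x, k):
    """Falling factorial x (x-1) ... (x-k+1)."""
    out = 1
    for j in range(k):
        out *= x - j
    return out

def charlier(n, x, a):
    """Charlier polynomial C_n(x; a) in the 2F0 convention."""
    return sum((-1)**k * comb(n, k) * falling(x, k) / a**k for k in range(n + 1))

def poisson_inner(m, n, a, terms=100):
    """<C_m, C_n> under the Poisson(a) pmf; should equal a^{-n} n! * delta_{mn}."""
    return sum(exp(-a) * a**x / factorial(x) * charlier(m, x, a) * charlier(n, x, a)
               for x in range(terms))

a = 1.0
for m in range(4):
    for n in range(4):
        expected = factorial(n) / a**n if m == n else 0.0
        assert abs(poisson_inner(m, n, a) - expected) < 1e-8, (m, n)
print("C_0..C_3 are orthogonal under the Poisson(1) weight")
```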
Robert, thank you! I needed your help! I learned a lot!