While studying a coin-toss experiment inspired by an interview question, I stumbled upon a discrete random variable whose moments (after centering and normalizing so that $\mu=0$, $\sigma=1$) are as follows:
\begin{eqnarray} E(X)&=&0\\ E(X^2)&=&1\\ E(X^3)&=&2\\ E(X^4)&=&9\\ E(X^5)&=&44 \end{eqnarray} ...and more generally $E(X^n)$ is the number of derangements on $n$ letters. On computing the MGF (moment generating function) I find
$$E(e^{tX})=\sum_{n=0}^\infty\frac{E(X^n)}{n!}t^n=\sum_{n=0}^\infty\frac{a_nt^n}{n!}$$
where $a_n$ is the number of derangements on $n$ letters.
Further, using a few combinatorial techniques, I found the MGF in closed form: $$\frac{e^{-t}}{1-t}.$$
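As a quick sanity check of this closed form (a minimal sketch; the function names are mine), one can compare $n!$ times the $t^n$ coefficient of $e^{-t}/(1-t)$, obtained by multiplying the series $\sum_k (-1)^k t^k/k!$ and $\sum_m t^m$, against the derangement numbers computed from the standard recurrence $D_n = nD_{n-1} + (-1)^n$:

```python
from fractions import Fraction
from math import factorial

def derangements(n):
    """Derangement numbers via the recurrence D(n) = n*D(n-1) + (-1)^n, D(0) = 1."""
    d = 1
    for k in range(1, n + 1):
        d = k * d + (-1) ** k
    return d

def mgf_coefficient(n):
    """n! times the t^n coefficient of e^(-t)/(1-t).

    The Cauchy product of sum_k (-1)^k t^k / k! and sum_m t^m gives the
    t^n coefficient sum_{k=0}^{n} (-1)^k / k!.
    """
    c = sum(Fraction((-1) ** k, factorial(k)) for k in range(n + 1))
    return c * factorial(n)

for n in range(10):
    assert mgf_coefficient(n) == derangements(n)

print([derangements(n) for n in range(1, 6)])  # → [0, 1, 2, 9, 44]
```

The exact-rational arithmetic via `Fraction` avoids any floating-point doubt, and the printed values match the moments listed above.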
My aim is to find a closed-form expression for the probability function $f(x)$, just as $f(n)=\lambda^n e^{-\lambda}/n!$ is the mass function of the Poisson distribution.
On looking up a comprehensive list of MGFs for known distribution families, I do not find the MGF above among those listed under the heading "Examples" at https://en.wikipedia.org/wiki/Moment-generating_function.
I even tried summation by parts (the discrete analogue of integration by parts) on the equation
$$E(e^{tX})=\sum_{x=0}^\infty e^{tx}f(x)=\frac{e^{-t}}{1-t}\quad\text{(summed over all whole numbers $x$)},$$
both with $u=e^{tx}$, $v=f(x)$ and vice versa, to no avail, as the resulting terms are dominated by the behaviour of $f(x)$ as $x\to\infty$. Could anyone shed light on how $f$ can be found or approximated, or point me to literature that helps identify $f$? Thanks in advance.
Recall that if $X,Y$ are independent random variables, then their MGFs satisfy $$M_{X+Y}(t) = M_X(t)\,M_Y(t).$$ Hence if we can recognize $\frac{e^{-t}}{1-t}$ as a product of two known MGFs, we can conclude.
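One such factorization (my own reading of the hint, worth verifying): $e^{-t}$ is the MGF of the constant $-1$, and $\frac{1}{1-t}$ is the MGF of $Y\sim\mathrm{Exp}(1)$, which suggests $X = Y-1$, a unit exponential shifted to start at $-1$. A minimal check that the moments of $Y-1$ reproduce the derangement numbers, using the standard fact $E[Y^k]=k!$ and the binomial theorem (function names are mine):

```python
from math import comb, factorial

def shifted_exp_moment(n):
    """E[(Y-1)^n] for Y ~ Exp(1), expanded by the binomial theorem.

    Uses E[Y^k] = k! for the unit exponential distribution.
    """
    return sum(comb(n, k) * factorial(k) * (-1) ** (n - k) for k in range(n + 1))

def derangements(n):
    """Derangement numbers via the recurrence D(n) = n*D(n-1) + (-1)^n, D(0) = 1."""
    d = 1
    for k in range(1, n + 1):
        d = k * d + (-1) ** k
    return d

assert all(shifted_exp_moment(n) == derangements(n) for n in range(10))
print([shifted_exp_moment(n) for n in range(1, 6)])  # → [0, 1, 2, 9, 44]
```

Indeed $\sum_{k=0}^n \binom{n}{k} k!\,(-1)^{n-k} = n!\sum_{j=0}^n \frac{(-1)^j}{j!} = D_n$, the derangement formula from inclusion–exclusion.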