Suppose the random variable $X$ is known to have an MGF, and its $k$-th moment is given by \begin{equation} \label{eqn:moments} m_k = (-1)^k \sum_{l=1}^k \sum_{\substack{j_i\ge 1 \\ j_1+\cdots +j_l = k}}\binom{k}{j_1 \ j_2 \ \cdots \ j_l }\binom{-2}{l}2^l \end{equation}
Once the moments are simplified and the MGF can be evaluated, the PDF follows (e.g., by inverting the Laplace transform). The real problem, however, is that $m_k$ is difficult to simplify. The form closely resembles the PMF of a multinomial distribution, but the condition $j_i\ge 1$ is exactly where I get stuck.
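For concreteness (and to have something to check a closed form against), here is a small brute-force evaluator of the sum; the helper names are my own, and the generalized binomial coefficient $\binom{-2}{l} = (-1)^l(l+1)$ is expanded by hand:

```python
# Brute-force evaluation of m_k by summing over all compositions
# (j_1, ..., j_l) of k with j_i >= 1.  A sketch, not part of the question.
from itertools import combinations
from math import factorial


def multinomial(k, parts):
    """Multinomial coefficient k! / (j_1! j_2! ... j_l!)."""
    out = factorial(k)
    for j in parts:
        out //= factorial(j)
    return out


def binom_neg2(l):
    """Generalized binomial coefficient C(-2, l) = (-1)^l * (l + 1)."""
    return (-1) ** l * (l + 1)


def compositions(k, l):
    """All l-tuples of positive integers summing to k."""
    for cuts in combinations(range(1, k), l - 1):
        bounds = (0,) + cuts + (k,)
        yield tuple(bounds[i + 1] - bounds[i] for i in range(l))


def moment(k):
    """m_k = (-1)^k * sum over l and compositions of the summand."""
    total = 0
    for l in range(1, k + 1):
        for parts in compositions(k, l):
            total += multinomial(k, parts) * binom_neg2(l) * 2 ** l
    return (-1) ** k * total
```

For instance, this gives $m_1 = 4$ and $m_2 = 20$, which any candidate simplification should reproduce.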
Any help is appreciated.
EDIT: I have found that there is a simplified version of $m_k$ (see the reference below), and it seems that Stirling numbers are the most promising route. From there, is it possible to evaluate a closed form for the MGF, noting that $$ M_X(t) = \sum_{k=0}^\infty m_k\frac{t^k}{k!} \ \ \ ?$$
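For what it is worth, here is one way the Stirling numbers enter, sketched via the standard surjection-counting identity (this may or may not be the route the reference takes): the inner sum over compositions with $j_i \ge 1$ counts surjections from a $k$-set onto $l$ blocks, $$\sum_{\substack{j_i\ge 1 \\ j_1+\cdots+j_l = k}}\binom{k}{j_1 \ j_2 \ \cdots \ j_l} = l!\,S(k,l),$$ where $S(k,l)$ is a Stirling number of the second kind, so $$m_k = (-1)^k\sum_{l=1}^k \binom{-2}{l}\,2^l\,l!\,S(k,l).$$ Substituting this into the series for $M_X(t)$ and using the exponential generating function $\sum_{k\ge l} S(k,l)\,\frac{t^k}{k!} = \frac{(e^t-1)^l}{l!}$ (with $t$ replaced by $-t$ to absorb the factor $(-1)^k$) would let each $l$-term of the MGF be summed separately.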