The characteristic function is defined as $$ \varphi (t) = \left\langle e^{i t X} \right\rangle $$
Expanding the exponential on the right-hand side gives $$ \varphi (t) = 1 + i t \left\langle X \right\rangle - \frac{t^2}{2} \left\langle X^2 \right\rangle - \frac{i t^3}{3!} \left\langle X^3 \right\rangle + \cdots $$
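As a numerical sanity check of this expansion (my own example, not part of the question), here is a sketch assuming $X \sim \mathrm{Exp}(1)$, for which $\left\langle X^n \right\rangle = n!$ and the exact characteristic function is $1/(1 - i t)$:

```python
import math

# Assumed example: X ~ Exponential(1), so <X^n> = n! and the exact
# characteristic function is phi(t) = 1 / (1 - i*t).
t = 0.1

# Truncated moment series: sum_{n=0}^{7} (i t)^n / n! * <X^n>.
# With <X^n> = n! each term reduces to (i t)^n, a geometric partial sum.
series = sum((1j * t) ** n / math.factorial(n) * math.factorial(n)
             for n in range(8))

exact = 1 / (1 - 1j * t)
truncation_error = abs(series - exact)  # of order |t|**8 here
```

The truncation error shrinks rapidly for small $t$, consistent with the series being an expansion around $t = 0$.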
Suppose we don't know the distribution of $X$, or its full characteristic function, but only $\left\langle X \right\rangle$. In that case, I think, a good approximation to $\varphi$ would be $$ \varphi (t) = \left\langle e^{i t X} \right\rangle \approx e^{i t \left\langle X \right\rangle} $$ (this yields a $\delta$-function as the distribution for $X$, which makes sense, since we only provided the mean value).
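To illustrate how good the mean-only approximation is (my own illustration, with an assumed example distribution $X \sim \mathrm{Uniform}(0, 2)$, so $\left\langle X \right\rangle = 1$), one can compare $e^{i t \left\langle X \right\rangle}$ against a Monte Carlo estimate of $\left\langle e^{i t X} \right\rangle$:

```python
import cmath
import random

# Assumed example distribution: X ~ Uniform(0, 2), so <X> = 1.
random.seed(0)
t = 0.05
samples = [random.uniform(0.0, 2.0) for _ in range(100_000)]

# Monte Carlo estimate of <exp(i t X)>
mc_phi = sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

# Mean-only approximation exp(i t <X>)
mean_only = cmath.exp(1j * t * 1.0)

# The two agree to O(t^2): the constant and first-order terms coincide,
# and the mismatch first appears at the <X^2> term.
err = abs(mc_phi - mean_only)
```

For small $t$ the discrepancy is of order $t^2$, which is exactly the first moment the approximation gets wrong.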
What if I know both $\left\langle X \right\rangle$ and $\left\langle X^2 \right\rangle$? Can I write down something similar that uses exactly these two moments? In this case it would probably be a Gaussian, since its parameters are determined by $\left\langle X \right\rangle$ and $\left\langle X^2 \right\rangle$. More generally, if I know the first $n$ moments, is it possible to write down a characteristic function, or the distribution itself, that reproduces these moments? And what about distributions of several variables, given all expectation values of products of them up to some order $n$? Thank you.
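For the two-moment case, the Gaussian guess would be $\varphi(t) \approx e^{i t \left\langle X \right\rangle - \frac{t^2}{2}\left(\left\langle X^2 \right\rangle - \left\langle X \right\rangle^2\right)}$. A small sketch (with hypothetical moment values I made up) checking via finite differences that this reproduces exactly the two supplied moments, using $\varphi'(0) = i\left\langle X \right\rangle$ and $\varphi''(0) = -\left\langle X^2 \right\rangle$:

```python
import cmath

# Hypothetical known moments (made-up values for illustration)
m1, m2 = 1.5, 3.0
var = m2 - m1 ** 2  # variance implied by the two moments

def phi(t):
    # Gaussian characteristic function built from m1 and m2
    return cmath.exp(1j * t * m1 - 0.5 * var * t ** 2)

# Finite-difference derivatives at t = 0
h = 1e-5
d1 = (phi(h) - phi(-h)) / (2 * h)              # should be i * m1
d2 = (phi(h) - 2 * phi(0) + phi(-h)) / h ** 2  # should be -m2
```

The second derivative gives $(i m_1)^2 - \mathrm{var} = -m_1^2 - (m_2 - m_1^2) = -m_2$, so both moments come out exactly as supplied, while all higher cumulants are set to zero.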