Let $\mathbf X=(X_1,\dotsc,X_n)$ be a vector of random variables. For $\mathbf a=(a_1,\dotsc,a_n)$, a vector of non-negative integers, let $\mathbf X^{\mathbf a}$ denote the monomial $X_1^{a_1}\dotsb X_n^{a_n}$.
Assume that $\mathbf E(\mathbf X^{\mathbf a})<\infty$ for every vector $\mathbf a$ of non-negative integers. Assume further that the joint probability distribution of $\mathbf X$ is completely determined by the set of mixed moments $\mathbf E(\mathbf X^{\mathbf a})$ as $\mathbf a$ runs over all vectors of non-negative integers.
Now suppose $\{\mathbf X^{(k)}\}$ is a sequence of vectors of random variables, with $\mathbf X^{(k)}=(X^{(k)}_1,\dotsc, X^{(k)}_n)$ for each $k$. Suppose that, for every vector $\mathbf a$ of non-negative integers, $$ \lim_{k\to \infty} \mathbf E\bigl((\mathbf X^{(k)})^{\mathbf a}\bigr)=\mathbf E(\mathbf X^{\mathbf a}). $$
I believe it should follow that $\mathbf X^{(k)}$ converges to $\mathbf X$ in distribution. This would be a multivariate analogue of the method of moments in probability theory.
Is it true? Is there a good reference to cite for it?
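To illustrate the kind of convergence I have in mind, here is a minimal numerical sketch (my own illustrative example, not taken from any reference): a bivariate CLT-type sequence whose mixed moments approach the Isserlis moments of a moment-determinate Gaussian limit. The summand distribution and the exponents checked are arbitrary choices made only for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_Xk(k, n_samples):
    # CLT-type sequence: X^(k) = (sum of k i.i.d. bounded summands) / sqrt(k), with
    # summand Y = (U, (U + V)/sqrt(2)), where U, V are independent Rademacher signs.
    # The distributional limit is the centred bivariate normal with unit variances
    # and correlation rho = 1/sqrt(2), which is determined by its moments.
    U = rng.choice([-1.0, 1.0], size=(n_samples, k))
    V = rng.choice([-1.0, 1.0], size=(n_samples, k))
    X1 = U.sum(axis=1) / np.sqrt(k)
    X2 = (U + V).sum(axis=1) / np.sqrt(2.0 * k)
    return X1, X2

# Mixed moments E[X1^a1 * X2^a2] of the limiting bivariate normal (via Isserlis' theorem),
# hard-coded for the few exponent pairs checked below.
rho = 1.0 / np.sqrt(2.0)
limit_moments = {(2, 0): 1.0, (1, 1): rho, (4, 0): 3.0, (2, 2): 1.0 + 2.0 * rho ** 2}

for k in (1, 4, 16, 64):
    X1, X2 = sample_Xk(k, n_samples=100_000)
    for (a1, a2), target in limit_moments.items():
        est = np.mean(X1 ** a1 * X2 ** a2)
        print(f"k={k:3d}  E[X1^{a1} X2^{a2}] ≈ {est:6.3f}   (limit {target:.3f})")
```

As $k$ grows, the Monte Carlo estimates of the mixed moments drift toward the Gaussian values, which is exactly the hypothesis in the displayed limit above; the question is whether that alone forces convergence in distribution.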
E. K. Haviland, On the Momentum Problem for Distribution Functions of More than one Variable, American Journal of Mathematics, Vol. 58, No. 1 (Jan. 1936), pp. 164–168.
The result is found on p. 632, as Theorem 1 (The Momentum Theorem).
In fact, Haviland proves a stronger result that does not assume the limiting mixed moments determine a unique probability distribution. In the determinate case, however, his result is exactly what is needed here.