There's a theorem which states that the moments, i.e. $M_n = \mathbb{E}\left(X^n\right)$, of a distribution uniquely identify the distribution if $$ R := \left(\limsup_{n\to\infty} \frac{1}{n}\sqrt[n]{M_n}\right)^{-1} > 0 \text{.} $$
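As a quick sanity check (my own illustration, not part of the theorem), the standard normal distribution has moments $M_{2k} = (2k-1)!!$ and $M_{2k+1} = 0$, and numerically $\frac{1}{n}\sqrt[n]{M_n}$ tends to $0$, so $R = \infty$ and the theorem applies:

```python
from math import factorial

def normal_moment(n):
    """n-th moment of the standard normal: 0 for odd n, (n-1)!! for even n."""
    if n % 2 == 1:
        return 0
    k = n // 2
    return factorial(n) // (2**k * factorial(k))  # (n-1)!! = n! / (2^k k!)

# The quantity (1/n) * M_n^(1/n) from the growth condition, for even n.
ratios = [normal_moment(n) ** (1 / n) / n for n in range(2, 41, 2)]
print(ratios[0], ratios[-1])  # shrinks towards 0, hence R = infinity here
```

Asymptotically $((n-1)!!)^{1/n} \sim \sqrt{n/e}$, so the printed ratios behave like $1/\sqrt{ne}$.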
The proof for this that I'm looking at argues this by observing that this condition implies the convergence of the following series expansion of the moment-generating function $$ M(t) = \sum_{n=0}^\infty M_n \frac{t^n}{n!} $$ on a neighbourhood of zero (note that $\sqrt[n]{n!} \geq n/e$, so the series converges at least on $(-R/e, R/e)$). It then asserts that since $\phi(t) = M(it)$, where $\phi$ is the characteristic function and $M$ the moment-generating function of some random variable $X$, this defines the characteristic function, which is known to define the distribution uniquely.
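To spell out the convergence step (my own computation, using Stirling's bound $n! \geq (n/e)^n$): the root test gives $$ \limsup_{n\to\infty} \left(M_n \frac{|t|^n}{n!}\right)^{1/n} = |t| \limsup_{n\to\infty} \frac{\sqrt[n]{M_n}}{\sqrt[n]{n!}} \leq |t| \limsup_{n\to\infty} \frac{e\sqrt[n]{M_n}}{n} = \frac{e|t|}{R} \text{,} $$ which is $< 1$ whenever $|t| < R/e$, so the series has a positive radius of convergence.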
My problem with that reasoning is that, as far as I can see, $M$ only defines $\phi$ on $M$'s interval of convergence, i.e. $(-R,R)$. Who's to say that there couldn't be two different extensions $\phi_1$, $\phi_2$ which both agree with $M(it)$ on $(-R,R)$? If I knew $\phi$ to be analytically extendable to the simply connected region $$ \{t + i\delta\,:\, |\delta| < \epsilon, t \in \mathbb{R}\} $$ (i.e., a strip around the real axis), then such an extension would be unique, and that would preclude the possibility of $\phi_1 \neq \phi_2$. But I don't see why such an extension would need to exist. Since I'm starting from (except for the growth condition) arbitrary moments, a priori I don't even know that there's any distribution with those moments. And even if I did, I'd only know that there's some $\phi'$ which agrees with $\phi$ on $[0,R)$, not necessarily on $[0,R)\times(-\epsilon,\epsilon)$.
Can anyone shed some light on this? I have the feeling that I'm overlooking some crucial property of $\phi$, but I can't seem to find it.
I think I've figured this out.
Let $F$ be a possible distribution function of $X$. We know that the moment-generating function $M(t) = \mathbb{E}_F e^{Xt}$ is uniquely defined and analytic on a ball of radius $R$ around zero. The characteristic function $\phi_F(z) = \mathbb{E}_F e^{iXz}$ then exists on the strip $S = \{z \in \mathbb{C} \,:\, |\operatorname{Im} z| < R\}$ around the real axis, because if $z = t - ir$ with $|r| < R$ then $$ \left|\phi_F(z)\right| = \left|\int_\mathbb{R} e^{ix(t-ir)} \,dF \right| \leq \int_\mathbb{R} \left|e^{ix(t-ir)}\right| \,dF = \int_\mathbb{R} e^{xr} \,dF = M(r) < \infty \text{.} $$
In other words, $x \mapsto e^{ixz}$ is $F$-integrable for all $z \in S$. Now since $z \mapsto e^{ixz}$ is analytic for every $x \in \mathbb{R}$, and since $(z,x) \mapsto e^{ixz}$ is continuous and thus measurable on $S\times\mathbb{R}$, it follows that $\phi_F$ is analytic on $S$ (see Sufficient conditions for $z \to \int_\mathbb{R} h(z,x) \,d\mu(x)$ to be analytic).
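For completeness, here is a sketch of the standard Morera/Fubini argument behind that last step (my own summary): for any closed triangular contour $\gamma$ in $S$, $$ \oint_\gamma \phi_F(z)\,dz = \oint_\gamma \int_\mathbb{R} e^{ixz} \,dF \,dz = \int_\mathbb{R} \oint_\gamma e^{ixz} \,dz \,dF = 0 \text{,} $$ where the interchange of integrals is justified by Fubini (on the compact contour, $|e^{ixz}| \leq e^{|x|r'}$ for some $r' < R$, which is $F$-integrable by the bound above) and the inner integral vanishes by Cauchy's theorem, since $z \mapsto e^{ixz}$ is entire. Morera's theorem then yields analyticity of $\phi_F$ on $S$.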
Now suppose $F_1$ and $F_2$ both have the given moments. Then $\phi_{F_1}$ and $\phi_{F_2}$ agree on the interval $(-R,R)$, where both equal $M(it)$, so by the identity theorem for analytic functions on the connected region $S$ they must in fact agree on all of $S$, in particular on $\mathbb{R}$, hence $F_1 = F_2$.
The approach Davide Giraudo suggested works too, of course, but the computations get a bit messier. This approach avoids all that nicely by appealing to the theorem about analytic extensions.