Suppose that $X$ and $Y$ are random variables, both taking values in $\{0, 1, 2, \ldots\}$. Further, suppose that $X$ and $Y$ have the same mgf for all $t$ in a neighborhood of $0$. Then it holds that $$\sum_{j=0}^{\infty}e^{tj}f_X(j)-\sum_{j=0}^{\infty}e^{tj}f_Y(j)=0$$
$$\Rightarrow\sum_{j=0}^{\infty}e^{tj}[f_X(j)-f_Y(j)]=0$$ $$\Rightarrow \sum_{j=0}^{\infty}e^{tj}c_j=0 $$ with $c_j:=f_X(j)-f_Y(j)$.
Is there a way to justify that $c_j=0$ for all $j \in \{0,1,\ldots\}$?
The series $g(z) = \mathbb E[z^X] = \sum_{j=0}^\infty z^j f_X(j)$ converges absolutely to an analytic function for $|z| < 1$, and the coefficients $f_X(j)$ can be obtained from the values of $g$ and its derivatives at $z=0$:
$$ f_X(j) = \frac{g^{(j)}(0)}{j!} $$
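As a concrete sanity check of this coefficient formula, here is a small sympy sketch using a Poisson distribution as an assumed example (its pgf $g(z) = e^{\lambda(z-1)}$ is standard); the choice $\lambda = 3/2$ is arbitrary:

```python
import sympy as sp

z = sp.symbols('z', positive=True)
lam_val = sp.Rational(3, 2)  # arbitrary example rate

# Poisson(lam) pgf: g(z) = E[z^X] = exp(lam*(z - 1))
g = sp.exp(lam_val * (z - 1))

# Recover the pmf via f_X(j) = g^{(j)}(0) / j! and compare with
# the known Poisson pmf lam^j e^{-lam} / j!
for j in range(6):
    coeff = sp.diff(g, z, j).subs(z, 0) / sp.factorial(j)
    pmf = lam_val**j * sp.exp(-lam_val) / sp.factorial(j)
    assert sp.simplify(coeff - pmf) == 0
```

The derivatives at $0$ pick out exactly the Taylor coefficients of $g$, which are the pmf values.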
The values of $g(z)$ for $z = e^{-t} \in (0,1)$ can be obtained from the moment generating function: $$ g(e^{-t}) = \mathbb E[e^{-tX}] = \sum_{j=0}^\infty e^{-tj} f_X(j)$$
So the moment generating function determines $g$ on the interval $(0,1)$; since $g$ is analytic on $|z| < 1$, the identity theorem then determines $g$ on the whole disk, and hence the distribution.
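The substitution $z = e^{-t}$ (equivalently $t = -\ln z$) connecting the mgf and $g$ can be checked symbolically; again using the Poisson example, where $\mathbb E[e^{-tX}] = e^{\lambda(e^{-t}-1)}$:

```python
import sympy as sp

t, z = sp.symbols('t z', positive=True)
lam = sp.Rational(3, 2)  # arbitrary example rate

# Poisson(lam): E[e^{-tX}] = exp(lam*(e^{-t} - 1))
laplace = sp.exp(lam * (sp.exp(-t) - 1))

# Substitute t = -log(z), i.e. z = e^{-t}, to recover the pgf g(z)
g = laplace.subs(t, -sp.log(z))

# This matches the Poisson pgf exp(lam*(z - 1))
assert sp.simplify(g - sp.exp(lam * (z - 1))) == 0
```

So evaluating the mgf-type expectation at $t = -\ln z$ recovers $g(z)$ for every $z \in (0,1)$, which is what the argument needs.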