I was reading the MGB statistics textbook, which says the following about "the problem of moments": "In general, a sequence of moments $\mu_1, \mu_2, \dots, \mu_n, \dots$ does not determine a unique distribution function; ... However, if the moment generating function of a random variable does exist, then this moment generating function does uniquely determine the corresponding distribution function."
It is hard for me to see the difference between these two concepts (a sequence of moments vs. a moment generating function). I've looked through several posts on this topic, and I know there is a counterexample: a family of distinct densities that all share the same sequence of moments, namely $f_\alpha(x) = \frac{1}{24}e^{-x^{1/4}}\left(1 + \alpha\sin(x^{1/4})\right)$ for $x > 0$ and $|\alpha| \le 1$.
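For what it's worth, the shared moments are easy to check numerically. A minimal sketch (assuming the family $f_\alpha(x) = \frac{1}{24}e^{-x^{1/4}}\left(1 + \alpha\sin(x^{1/4})\right)$, $x > 0$, $|\alpha| \le 1$, and using SciPy's `quad`; the function names here are just for illustration):

```python
from math import exp, sin, factorial
from scipy.integrate import quad

# Substituting u = x**(1/4), dx = 4u**3 du, the n-th moment of f_alpha
# splits into
#   (1/6) * int_0^inf u**(4n+3) e**(-u) du
#     + (alpha/6) * int_0^inf u**(4n+3) e**(-u) sin(u) du,
# so every member of the family has the same moments iff the second
# integral vanishes for all n -- which it does.

def base_moment(n):
    """Alpha-independent part of the n-th moment; should equal (4n+3)!/6."""
    val, _ = quad(lambda u: u**(4*n + 3) * exp(-u) / 6, 0, float("inf"))
    return val

def perturbation(n):
    """Alpha-dependent part of the n-th moment; should be ~0 for every n."""
    val, _ = quad(lambda u: u**(4*n + 3) * exp(-u) * sin(u) / 6, 0, float("inf"))
    return val

for n in range(4):
    print(f"n={n}: moment={base_moment(n):.6g}, "
          f"(4n+3)!/6={factorial(4*n + 3)/6:.6g}, "
          f"perturbation={perturbation(n):.2e}")
```

The perturbation term comes out numerically zero for every $n$, so all values of $\alpha$ give the same moment sequence even though the densities differ.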
I also read the proof of the uniqueness of the moment generating function.
But doesn't that sequence of moments define a moment generating function? After all, $m(t)$ can be written as

$$m(t) = \mathrm{E}\left[e^{tX}\right] = \sum_{n=0}^{\infty} \frac{\mu_n t^n}{n!}.$$
Can someone fill the gap for me? Thanks so much!

The problem is that the series for the moment generating function $m(t)$ might not converge anywhere except $t=0$. The actual result is that if this series has a positive radius of convergence, then it uniquely determines the distribution.
In your example, the moments are $\mu_n = (4n+3)!/6$, and it's easy to see using the Ratio Test that the radius of convergence is $0$.
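Concretely, with $\mu_n = (4n+3)!/6$, applying the Ratio Test to $\sum_n \mu_n t^n / n!$ gives

$$\left|\frac{\mu_{n+1}\, t^{n+1}/(n+1)!}{\mu_n\, t^n/n!}\right| = \frac{(4n+7)!}{(4n+3)!}\cdot\frac{|t|}{n+1} = \frac{(4n+4)(4n+5)(4n+6)(4n+7)}{n+1}\,|t| \;\longrightarrow\; \infty$$

for every $t \neq 0$, so the series diverges for all $t \neq 0$ and the moment generating function does not exist, even though every moment is finite.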