I know that when two random variables $X$ and $Y$ are independent, then the MGF of $X+Y$ is $$ MGF_{X+Y}(t) = \mathbb{E}[e^{t(X + Y)}] = \mathbb{E}[e^{tX}e^{tY}] = MGF_{X}(t)MGF_{Y}(t) $$
However, is the reverse true? If we know that $MGF_{X+Y}(t) = MGF_{X}(t)MGF_{Y}(t)$, then are $X$ and $Y$ necessarily independent?
No, it is not. Let $(X,Y)$ be a random vector in $\mathbb R^2$, let $\mu$ be the law of $X$, let $\nu$ be the law of $Y$, let $\mu\otimes\nu$ be the product measure of $\mu$ and $\nu$, and let $(X',Y')$ have law $\mu\otimes\nu$. This means that $X'$ is a copy of $X$ and $Y'$ is a copy of $Y$, but $X'$ and $Y'$ are independent. Suppose for a moment that $X'+Y'$ has the same law as $X+Y$. Then $$MGF_{X+Y}(t)=MGF_{X'+Y'}(t)=MGF_{X'}(t)MGF_{Y'}(t)=MGF_{X}(t)MGF_{Y}(t),$$ so the equality of the moment generating functions holds (as does the equality of any other quantity derived from the distributions).
So, if we can find any dependent pair $(X,Y)$ such that $X'+Y'$ and $X+Y$ have the same law, we are done. Naturally, we start with finite supports. Say we have $n>1$ elements, the support $\mathcal S=\mathbb Z\cap[1,n]$, and marginal probabilities $p,q\in(0,1)^n$ with $\sum_i p_i=\sum_i q_i=1$. This determines $X$ and $Y$, but also $(X',Y')$ (given by the product measure) and hence $X'+Y'$. The probabilities of the sum are easy to compute: setting $p_i=q_i=0$ for $i\in\mathbb Z\setminus\mathcal S$, we get $$s_i=\mathbb P(X'+Y'=i)=\sum_{j=-\infty}^{\infty}p_jq_{i-j}\,,\quad i\in\mathbb Z\cap[2,2n],$$ the convolution of $p$ and $q$.

Now, which probabilities $P\in[0,1]^{\mathcal S\times\mathcal S}$ for $(X,Y)$ can we come up with such that $X$ matches $p$, $Y$ matches $q$ and $X+Y$ matches $s$? To be precise, $P$ has to solve $$\begin{aligned} \sum_{j=1}^nP_{i,j}&=p_i, & i&\in\mathcal S,\\ \sum_{i=1}^nP_{i,j}&=q_j, & j&\in\mathcal S,\\ \sum_{j=-\infty}^{\infty}P_{j,i-j}&=s_i, & i&\in\mathbb Z\cap[2,2n], \end{aligned}$$ where we also let $P_{i,j}=0$ for $(i,j)\in\mathbb Z^2\setminus\mathcal S^2$. This means that the $n^2$ variables $P_{i,j}$ have to solve $n+n+2n-1=4n-1$ equations. By construction, we know that a solution exists, namely $P_{i,j}=p_iq_j$, i.e. the independent case. While there seems to be little flexibility for $n<4$ (because then $n^2<4n-1$), the situation for $n\ge 4$ looks promising.
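This setup is easy to check numerically. Below is a sketch in Python/NumPy (the names `A` and `rhs` are my own) that assembles the $4n-1$ equations as a matrix acting on the flattened $P$ and verifies that the product measure solves them:

```python
import numpy as np

n = 4
rng = np.random.default_rng(0)
p = rng.random(n); p /= p.sum()          # marginal of X
q = rng.random(n); q /= q.sum()          # marginal of Y
s = np.convolve(p, q)                    # law of X'+Y', length 2n-1

# Build the (4n-1) x n^2 constraint matrix acting on the flattened P.
A = np.zeros((4 * n - 1, n * n))
for i in range(n):
    for j in range(n):
        k = i * n + j
        A[i, k] = 1              # row sums must match p
        A[n + j, k] = 1          # column sums must match q
        A[2 * n + i + j, k] = 1  # anti-diagonal sums must match s
rhs = np.concatenate([p, q, s])

# The product measure solves the system, as claimed.
P_prod = np.outer(p, q)
print(np.allclose(A @ P_prod.ravel(), rhs))  # True
```
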
First, recall the constraints $0\le P_{i,j}\le 1$ and, of course, $\sum_{i,j}P_{i,j}=1$. But any $P$ with $P_{i,j}\ge 0$ (for all $i,j$) that solves the system of equations automatically fulfils $\sum_{i,j}P_{i,j}=\sum_ip_i=1$, which in turn implies $P_{i,j}\le 1$ by non-negativity. Next, notice that $p_iq_j>0$ for the canonical solution (since we assumed $p_i,q_j>0$). Further, recall from linear algebra that the solution set of the system over $\mathbb R^{n\times n}$ is an affine subspace $\mathcal A$ (with base point $(p_iq_j)_{i,j}$) of dimension at least $n^2-(4n-1)=n^2-4n+1$, which is positive for $n\ge 4$. Since the base point has strictly positive entries, a sufficiently small step along any direction in $\mathcal A$ keeps all entries non-negative. Hence $\mathcal A\cap[0,\infty)^{n\times n}$ contains infinitely many laws $P$, each of which solves the system, but only one of them is a product measure.
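The dimension bound can be verified numerically as well. The following sketch (the helper name `constraint_matrix` is my own) computes the nullity of the constraint matrix for several $n$; note the actual nullity comes out somewhat larger than $n^2-4n+1$, since the $4n-1$ equations are not linearly independent:

```python
import numpy as np

def constraint_matrix(n):
    """(4n-1) x n^2 matrix of row-, column- and anti-diagonal-sum constraints."""
    A = np.zeros((4 * n - 1, n * n))
    for i in range(n):
        for j in range(n):
            k = i * n + j
            A[i, k] = 1              # row sums
            A[n + j, k] = 1          # column sums
            A[2 * n + i + j, k] = 1  # anti-diagonal sums
    return A

for n in range(2, 7):
    A = constraint_matrix(n)
    nullity = n * n - np.linalg.matrix_rank(A)
    print(n, nullity, max(0, n * n - 4 * n + 1))  # nullity vs. the lower bound
```
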
If you prefer a concrete example, just take $n=4$, set $p_i=q_j=1/4$ and follow the steps above to determine masses $P_{i,j}$ (that are not all $1/16$ :-).
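Here is one way to carry that out, as a Python/NumPy sketch. The perturbation matrix `D` below (found by hand; any small enough element of the null space described above would do) has vanishing row, column and anti-diagonal sums, so $P=\frac1{16}+D$ keeps the uniform marginals and the law of the sum while breaking independence:

```python
import numpy as np

# D has zero row sums, zero column sums and zero anti-diagonal sums,
# and its entries are small enough that 1/16 + D stays non-negative.
D = np.array([[ 0,  1,  0, -1],
              [-1, -1,  1,  1],
              [ 1,  0, -1,  0],
              [ 0,  0,  0,  0]]) / 32.0
P = np.full((4, 4), 1 / 16) + D          # a valid pmf, not all 1/16
assert (P >= 0).all() and np.isclose(P.sum(), 1)

# Marginals of X and Y are still uniform on {1, 2, 3, 4}.
print(P.sum(axis=1))   # [0.25 0.25 0.25 0.25]
print(P.sum(axis=0))   # [0.25 0.25 0.25 0.25]

# The law of X + Y equals the independent convolution ...
s_dep = np.array([sum(P[i, k - i] for i in range(4) if 0 <= k - i < 4)
                  for k in range(7)])
s_ind = np.convolve(np.full(4, 1 / 4), np.full(4, 1 / 4))
print(np.allclose(s_dep, s_ind))         # True

# ... yet P is not the product of its marginals.
print(np.allclose(P, np.outer(P.sum(axis=1), P.sum(axis=0))))  # False
```

Since the law of $X+Y$ agrees with the independent convolution, $MGF_{X+Y}(t)=MGF_X(t)MGF_Y(t)$ holds here, although $X$ and $Y$ are dependent.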
So, to be very clear: we have not only found many examples of pairs for which the equality of the MGFs still holds, we have found many examples of pairs with fixed marginal distributions (such pairs are sometimes called couplings) and fixed distribution of the sum. On top of that, we have seen how this reduces to basic linear algebra. Notice that the same can be done with other linear constraints on the joint law (in particular fixing the expectation).