Let $X$ be a random variable taking values in the discrete set $\{a_1, \dots, a_m\}$ with probabilities $\{p_1, \dots, p_m\}$, such that $\mathbb{E}[X] = 0$, and let $\{X_i\}_{i=1}^{\infty}$ be a sequence of independent random variables, each distributed as $X$. Writing $\sigma^2 = \mathbb{E}[X^2]$ and defining $\displaystyle Y_n = \frac{1}{\sigma\sqrt{n}}\sum_{i = 1}^n X_i$ for $n \geq 1$, the central limit theorem states that $Y_n$ converges in distribution to a standard normal variable $Z$. Equivalently (by the portmanteau theorem), $\mathbb{E}[f(Y_n)] \to \mathbb{E}[f(Z)]$ for every bounded continuous function $f$.
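As a quick sanity check of the last statement, here is a small sketch (assuming, purely for illustration, that $X$ is Rademacher, i.e. $\pm 1$ with probability $1/2$, so $\sigma = 1$) with the bounded continuous test function $f(x) = \cos x$. In that case $\mathbb{E}[\cos(Y_n)] = \cos(1/\sqrt{n})^n$ exactly, since $\mathbb{E}[e^{isS_n}] = \cos(s)^n$, and $\mathbb{E}[\cos(Z)] = e^{-1/2}$:

```python
import numpy as np

# Illustration only: X is Rademacher (+/-1 with prob 1/2), so sigma = 1.
# For f(x) = cos(x): E[cos(Y_n)] = cos(1/sqrt(n))**n exactly,
# because E[exp(i*s*S_n)] = cos(s)**n by independence.
target = np.exp(-0.5)  # E[cos(Z)] for standard normal Z
for n in [10, 100, 1000]:
    val = np.cos(1 / np.sqrt(n)) ** n
    print(n, val, abs(val - target))  # gap shrinks as n grows
```

The printed gap shrinks like $O(1/n)$ here, which is faster than the generic rate because this particular $X$ is symmetric.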
Now, under the assumption of a finite third absolute moment $\rho = \mathbb{E}[|X|^3]$, which holds in our case since $X$ has finite support, the Berry–Esseen theorem gives a rate for this convergence: $\displaystyle \sup_x |F_n(x) - \Phi(x)| \leq \frac{C\rho}{\sigma^3 \sqrt{n}}$, where $F_n$ and $\Phi$ are the cumulative distribution functions of $Y_n$ and of the standard normal, respectively, and $C$ is an absolute constant.
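The $n^{-1/2}$ rate can be checked numerically. A minimal sketch, again assuming a Rademacher $X$ (so $\rho = \sigma = 1$ and $F_n$ is an exactly computable scaled-binomial CDF): the Kolmogorov distance is attained at jump points of $F_n$, so it suffices to compare both one-sided limits of $F_n$ with $\Phi$ there.

```python
import numpy as np
from scipy.stats import binom, norm

# Illustration only: X Rademacher, so S_n = 2*K - n with K ~ Binomial(n, 1/2)
# and Y_n = S_n / sqrt(n). Here rho = sigma = 1.
for n in [100, 400, 1600]:
    k = np.arange(n + 1)           # number of +1 outcomes
    x = (2 * k - n) / np.sqrt(n)   # jump points of F_n
    Fn = binom.cdf(k, n, 0.5)      # F_n at the jump points (right limit)
    # sup |F_n - Phi| is attained at a jump; check both one-sided limits
    gap = np.maximum(np.abs(Fn - norm.cdf(x)),
                     np.abs(Fn - binom.pmf(k, n, 0.5) - norm.cdf(x)))
    sup = gap.max()
    print(n, sup, sup * np.sqrt(n))  # sup * sqrt(n) stays bounded
```

The rescaled quantity `sup * sqrt(n)` stabilizes (around $1/\sqrt{2\pi} \approx 0.40$ for this lattice example), consistent with the $n^{-1/2}$ rate being sharp.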
I am interested in obtaining a similar bound for the moment generating function of $Y_n$, that is, on the difference $|\mathbb{E}[e^{tY_n}] - \mathbb{E}[e^{tZ}]|$. This previous question establishes that the moment generating functions do converge; note that this does not follow directly from the statement for bounded continuous functions, since $x \mapsto e^{tx}$ is unbounded, but it does follow from convergence in distribution together with uniform integrability of $e^{tY_n}$ (which holds here because $X$ is bounded). The answer to that question also hints at how to obtain the rates; however, I was unable to obtain the bounds using the suggested method.
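Numerically, an $O(n^{-1/2})$ rate for the MGF difference does look plausible. A minimal sketch, using an assumed asymmetric example $X \in \{2, -1\}$ with probabilities $\{1/3, 2/3\}$ (so $\mathbb{E}[X] = 0$, $\sigma^2 = 2$, and $\mathbb{E}[X^3] = 2 \neq 0$, which prevents the symmetric cancellation that would give a faster rate); by independence the MGF of $Y_n$ can be computed exactly as $\big(\mathbb{E}[e^{tX/(\sigma\sqrt{n})}]\big)^n$:

```python
import numpy as np

# Illustration only: asymmetric zero-mean X with values {2, -1},
# probabilities {1/3, 2/3}; then sigma^2 = 2 and E[X^3] = 2 != 0.
a = np.array([2.0, -1.0])
p = np.array([1 / 3, 2 / 3])
sigma = np.sqrt(p @ a**2)

t = 1.0
for n in [100, 400, 1600]:
    # Exact MGF of Y_n via independence: E[e^{tY_n}] = (E[e^{tX/(sigma sqrt(n))}])^n
    mgf_Yn = (p @ np.exp(t * a / (sigma * np.sqrt(n)))) ** n
    diff = abs(mgf_Yn - np.exp(t**2 / 2))  # compare with E[e^{tZ}] = e^{t^2/2}
    print(n, diff, diff * np.sqrt(n))      # diff * sqrt(n) roughly constant
```

The rescaled difference `diff * sqrt(n)` settles near a constant, matching the heuristic that the leading error term is $e^{t^2/2}\, t^3\, \mathbb{E}[X^3] / (6\sigma^3\sqrt{n})$ from the cumulant expansion.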
This seems intuitive enough that I would expect existing results in the literature deriving such bounds; however, I have not been able to find any. Any solutions or references would be highly appreciated. Thanks.