This may be a very stupid question but I just want some clarification. Take for example a geometrically distributed random variable $Y$ with PMF $p_Y(y) = p(1-p)^{y-1}$ and MGF $\mathbb{E}[e^{tY}] = \frac{pe^{t}}{1-(1-p)e^t}$. Now say I want to find $\mathbb{E}[g(t)^Y]$, can I just simply replace $e^t$ in the aforementioned MGF with the function $g(t)$? E.g., $\mathbb{E}[\left(2^t + 3t^2 + \log(t)\right)^Y] = \frac{p\left(2^t + 3t^2 + \log(t)\right)}{1-(1-p)\left(2^t + 3t^2 + \log(t)\right)}$.
Can I do this for any generic random variable $Y$?
For this random variable, $\mathbb{E} [g(t)^Y] = \sum_{y=1}^{\infty} g(t)^y p(1-p)^{y-1}$
So we get $\mathbb{E} [g(t)^Y] = pg(t)\sum_{y=1}^{\infty} g(t)^{y-1} (1-p)^{y-1}$, and now we need to assume that $|g(t)(1-p)| < 1$. Then, by the formula for the sum of a geometric series, we get
$\mathbb{E} [g(t)^Y] = \frac {pg(t)} {1-(1-p)g(t)}.$ So in your example, with $g(t) = 2^t + 3t^2 + \log(t)$, your expression for the expectation is true ONLY when $|g(t)(1-p)|<1$.
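As a quick numerical sanity check of this closed form (my own sketch, not part of the derivation above), we can compare a truncated version of the series with $\frac{pg(t)}{1-(1-p)g(t)}$. The values $p = 0.6$ and $t = 0.5$ are arbitrary choices made so that $|g(t)(1-p)| < 1$ holds:

```python
import math

# Geometric Y with parameter p: compare the truncated series
# sum_{y=1}^{N} g(t)^y * p * (1-p)^(y-1) with the closed form.
# p = 0.6 and t = 0.5 are arbitrary, chosen so |g(t)(1-p)| < 1.

def g(t):
    return 2**t + 3*t**2 + math.log(t)

p, t = 0.6, 0.5
gt = g(t)
assert abs(gt * (1 - p)) < 1   # the convergence condition derived above

series = sum(gt**y * p * (1 - p)**(y - 1) for y in range(1, 200))
closed_form = p * gt / (1 - (1 - p) * gt)

print(series, closed_form)     # agree to many decimal places
```

Because the ratio $g(t)(1-p) \approx 0.59$ here, 200 terms are far more than enough for the truncation error to be negligible.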
This region for $t$ is easy to work out when $g(t) = e^t$: since $e^t$ and $(1-p)$ are both positive, the condition is simply $e^t (1-p) < 1$, i.e. $e^t < \frac{1}{1-p}$. As the exponential function is monotone increasing, the region where the moment generating function exists is $t \in (-\infty, -\log(1-p))$ (if you didn't already know this).
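This boundary can also be seen numerically (a sketch with arbitrary values of my own: $p = 0.6$, so the boundary is $-\log(0.4) \approx 0.916$). Inside the region the partial sums of $\sum_y e^{ty} p(1-p)^{y-1}$ settle down to the closed form; outside it, the terms grow and the partial sums diverge:

```python
import math

p = 0.6
boundary = -math.log(1 - p)   # ~0.916 for p = 0.6

def mgf_partial(t, n):
    # partial sum of E[e^{tY}] for geometric Y: sum_{y=1}^{n} e^{ty} p (1-p)^{y-1}
    return sum(math.exp(t * y) * p * (1 - p)**(y - 1) for y in range(1, n + 1))

t_in = 0.5   # inside (-inf, -log(1-p)): converges to the closed form
closed = p * math.exp(t_in) / (1 - (1 - p) * math.exp(t_in))
print(mgf_partial(t_in, 200), closed)

t_out = 1.2  # outside the region: e^t (1-p) > 1, so the terms grow
print(mgf_partial(t_out, 50), mgf_partial(t_out, 100))  # partial sums keep growing
```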
For a generic discrete random variable we will always get an expression of the form $\mathbb{E} [g(t)^Y] = \sum_{y} g(t)^{y} f_{Y} (y)$, where $f_{Y} (y)$ is the probability mass function of the random variable and the sum runs over its support (for a continuous random variable the sum becomes an integral). It is convenient that in this case the $g(t)$ gets summed in the same way as the $f_{Y} (y)$ (as a geometric series), but in general for other distributions this won't happen.
You might like to try a Poisson random variable to convince yourself of this: take $X \sim \mathrm{Po}(2)$, and you won't be able to take the $g(t)$ out of the sum the way you could with the geometric series (the sum instead resolves through the exponential series).
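Trying that suggested Poisson exercise numerically (my own sketch; the value $g = 1.7$ is an arbitrary test point): for $X \sim \mathrm{Po}(2)$ the sum $\sum_{x=0}^{\infty} g^x e^{-2} 2^x / x!$ collapses via the exponential series to $e^{2(g-1)}$ rather than via a geometric series:

```python
import math

lam = 2.0    # X ~ Po(2)
gval = 1.7   # an arbitrary test value standing in for g(t)

# Truncated version of E[g^X] = sum_x g^x e^{-lam} lam^x / x!
series = sum(gval**x * math.exp(-lam) * lam**x / math.factorial(x)
             for x in range(0, 80))

# The exponential series gives the closed form e^{lam (g - 1)}
print(series, math.exp(lam * (gval - 1)))   # the two agree
```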