I know that the PDF of the Gamma distribution is $f_{\alpha,\theta}(x)=\frac{x^{\alpha-1}e^{-x/\theta}}{\Gamma(\alpha)\theta^\alpha}$, but when calculating the MGF the following integral is used
$\begin{align} E[e^{tX}] &=\int_0^\infty e^{tx}f_{\alpha,\theta}(x)\,dx \\ &=\frac{1}{\Gamma(\alpha)\theta^\alpha}\int_0^\infty e^{tx}x^{\alpha-1}e^{-x/\theta}\,dx\end{align}$
My question is: when plugging $e^{tX}$ into the PDF (when using the law of the unconscious statistician), why do you not also plug it into the $x$ in $f(x)$? Isn't the entire $e^{tX}$ being treated as the $x$ in the law of the unconscious statistician formula?
Because that's not the law of the unconscious statistician. It says $$ E(g(X)) = \int g(x)f_X(x)\,dx.$$ You are taking the average value of $g(X)$ (where $g(X) = e^{tX}$ in this particular case): you substitute into $g$, while the density $f_X$ still weights the original values $x$.
Maybe it's easiest to see if, instead of a continuous distribution, you have a simple discrete distribution with $P(X=1) = 1/2$ and $P(X=0)=1/2,$ so $X$ has a fifty-fifty chance of being $1$ or $0.$ Then say you want $E(e^{X}).$ We know the average is $$ E(e^X) = \frac{1}{2}e^1 + \frac{1}{2}e^0 = \frac{e+1}{2}.$$ We can write this in the form of the LOTUS as $$ E(g(X)) = g(1)P(X=1) + g(0)P(X=0)$$ where $g(x) = e^x.$
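A few lines of Python make the discrete version of LOTUS concrete: you plug the outcomes $1$ and $0$ into $g$, and weight by their original probabilities.

```python
import math

# LOTUS for the discrete example: X = 1 or X = 0, each with probability 1/2,
# and g(x) = e^x. We average the *values* g(1), g(0), weighted by P(X = 1), P(X = 0).
g = math.exp

lotus = g(1) * 0.5 + g(0) * 0.5   # = (e + 1) / 2
print(lotus)
print((math.e + 1) / 2)           # same number
```

Note that the probabilities $P(X=1)$ and $P(X=0)$ are untouched by $g$; only the outcomes are transformed.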
Note this is very different from writing $$ E(g(X)) = g(1)P(g(X) = 1) + g(0) P(g(X)=0),$$ which is the analogue of what you suggested: $$ E(e^{tX}) = \int e^{tx}f_X(e^{tx})\,dx.$$ This would make no sense: $P(g(X) = 0) = P(e^X=0) = 0,$ since $e^X$ is either $e$ or $1,$ never zero.
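The same point carries over to the continuous Gamma case: integrating $e^{tx}f_{\alpha,\theta}(x)$ (with $x$, not $e^{tx}$, inside the density) reproduces the known closed form of the MGF, $E[e^{tX}] = (1-\theta t)^{-\alpha}$ for $t < 1/\theta$. Here is a quick numerical sanity check; the values $\alpha = 2.5,$ $\theta = 1.5,$ $t = 0.2$ are arbitrary example choices of mine.

```python
import math

# Sanity check of LOTUS for the Gamma MGF: numerically integrate
# e^{tx} * f(x) and compare to the closed form (1 - theta*t)^(-alpha).
alpha, theta, t = 2.5, 1.5, 0.2   # arbitrary, chosen so that t < 1/theta

def gamma_pdf(x):
    # Gamma(alpha, theta) density, same parameterization as in the question.
    return x**(alpha - 1) * math.exp(-x / theta) / (math.gamma(alpha) * theta**alpha)

# Trapezoid rule for the integral of e^{tx} * f(x) on [0, 200]
# (the integrand is negligible beyond that for these parameters).
n, upper = 200_000, 200.0
h = upper / n
ys = [math.exp(t * i * h) * gamma_pdf(i * h) for i in range(n + 1)]
integral = h * (sum(ys) - 0.5 * (ys[0] + ys[-1]))

closed_form = (1 - theta * t)**(-alpha)
print(integral, closed_form)   # both approximately 2.439
```

If you instead fed $e^{tx}$ into the density itself, as in the mistaken formula above, the integral would not converge to anything meaningful.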