The moment generating function (MGF) of $X$, written $M_X(t)$, is defined as
$$M_X(t) = E[e^{tX}] = 1 + tE[X] + \frac{t^2 E[X^2]}{2!} + \cdots$$
Question 1: Is the MGF a function of one variable or two (i.e., of $t$ alone, or of $t$ and $X$)?
Question 2: I know this probably looks dumb, but why is it not written $M(t,X)$ or $M_{t,X}(t,X)$?
For example, in the law of the unconscious statistician, $E[g(X)]$ looks like a function of $X$. So maybe $E[e^{tX}]$ could be thought of as $E[g(X,t)]$.
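As a side note, it may help to see the law of the unconscious statistician computed concretely. The sketch below (a hypothetical illustration, not from the thread) uses a fair six-sided die with $g(x) = x^2$; notice that the sum runs over all values of $x$, so the result is a single number, with no $X$ left in it:

```python
from fractions import Fraction

# LOTUS for a fair six-sided die: E[g(X)] = sum over x of g(x) * P(X = x).
# The variable x is summed out, so the result does not depend on any
# realization of X -- it is just a number.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def g(x):
    return x * x  # example choice g(x) = x^2, so E[g(X)] = E[X^2]

expectation = sum(g(x) * p for x, p in pmf.items())
print(expectation)  # 91/6
```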
I don't think $f(x,y)$ is equivalent to $f_X(y)$ at all, since one is a joint pdf and the other a marginal pdf.
The reason I'm asking is that I'm learning about MGFs and want to understand the elementary stuff. Thanks for your help; I know this is not a very inspired question.
The MGF is the expectation of a function of a random variable, and thus cannot depend on any realization of that random variable; that is to say, it is not a function of $X$. It is a function of $t$, as well as of any parameters of the distribution of $X$.
This is not a correct statement. Why do you think this is true?
For example, suppose $X \sim \operatorname{Bernoulli}(p)$. Then $$\Pr[X = 1] = p, \quad \Pr[X = 0] = 1-p.$$ It follows that $$M_X(t) = \operatorname{E}[e^{tX}] = e^0 \cdot \Pr[X = 0] + e^t \cdot \Pr[X = 1] = (1-p) + e^t p = (e^t - 1)p + 1.$$ Note there is no $X$ in the result, only $t$ and the parameter $p$.
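The Bernoulli computation above can be checked numerically. The sketch below (an illustration I'm adding, with hypothetical parameter values $p = 0.3$, $t = 0.5$) evaluates the MGF exactly from the pmf and also estimates it by Monte Carlo; both agree with the closed form $(e^t - 1)p + 1$, and in neither case does any realized value of $X$ survive into the answer:

```python
import math
import random

def bernoulli_mgf(p, t):
    # Exact MGF from the pmf: e^{t*0} * P(X=0) + e^{t*1} * P(X=1).
    return math.exp(t * 0) * (1 - p) + math.exp(t * 1) * p

def mgf_estimate(p, t, n=200_000, seed=0):
    # Monte Carlo: average e^{tX} over simulated Bernoulli(p) draws.
    # The draws are averaged out, leaving a number that depends only
    # on p and t (up to sampling noise).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        total += math.exp(t * x)
    return total / n

p, t = 0.3, 0.5
closed_form = (math.exp(t) - 1) * p + 1
print(closed_form)             # value of M_X(0.5) for p = 0.3
print(bernoulli_mgf(p, t))     # identical, computed from the pmf
print(mgf_estimate(p, t))      # close, estimated by simulation
```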