I have $N_{t}$ governed by a compound $\mathsf{Poisson}(t\lambda)$ distribution with expected value $t\lambda$ and a sequence of $X_{i}$ where each $X_{i}$ is independent of $N_{t}$. If $Y = \sum^{N_{t}}_{i=1} X_{i}$, how can I find the mgf and first moment of $Y$?
I am used to the continuous case where I can integrate and differentiate, but I don't think I can apply that here due to the discrete nature of this distribution.
First off, saying "$N_t$ governed by a compound $\mathsf{Poisson}(t\lambda)$ distribution" and defining $Y=\sum_{i=1}^{N_t} X_i$ is a bit confusing. I believe what you meant is that there is a given value $\lambda>0$, and for each $t>0$ you define the random variable $$ Y_t = \sum_{i=1}^{N_t} X_i, $$ where $N_t$ has $\mathsf{Poisson}(t\lambda)$ distribution. Then $\{Y_t : t>0\}$ is a stochastic process (a collection of random variables indexed by time) and $Y_t$ has compound $\mathsf{Poisson}(t\lambda)$ distribution.
In this case, assuming the $X_i$ are i.i.d. and independent of $N_t$, the expected value of $Y_t$ can be computed using the law of total expectation: \begin{align} \mathbb E[Y_t] &= \mathbb E[\mathbb E[Y_t\mid N_t]]\\ &= \mathbb E[N_t\mathbb E[X_1]]\\ &= \mathbb E[N_t]\mathbb E[X_1]\\ &= t\lambda\mathbb E[X_1]. \end{align} The moment-generating function of $Y_t$ is given by \begin{align} M_{Y_t}(\theta) :&= \mathbb E[\exp(\theta Y_t)]\\ &= \mathbb E\left[\exp\left(\theta\sum_{i=1}^{N_t} X_i\right) \right]\\ &= \mathbb E\left[\mathbb E\left[\prod_{i=1}^{N_t}\exp\left(\theta X_i\right)\,\Bigg|\, N_t \right]\right]\\ &= \mathbb E\left[M_{X_1}(\theta)^{N_t} \right], \end{align} where the last equality uses the fact that, conditional on $N_t$, the $X_i$ are i.i.d. with mgf $M_{X_1}$. Recall that for a random variable $W$ with $\mathsf{Poisson}(\mu)$ distribution, the probability-generating function of $W$ is \begin{align} \mathbb E[s^W] &= \sum_{k=0}^\infty \mathbb P(W=k)\cdot s^k\\ &= \sum_{k=0}^\infty e^{-\mu}\frac{\mu^k}{k!}\cdot s^k\\ &= e^{-\mu}\sum_{k=0}^\infty \frac{(s\mu)^k}{k!}\\ &= e^{-\mu} e^{s\mu}\\ &= e^{\mu(s - 1)}. \end{align} It follows then that the probability-generating function of $N_t$ is $e^{t\lambda(s -1)}$, and so $$ M_{Y_t}(\theta) = e^{t\lambda (M_{X_1}(\theta)-1)}. $$ Differentiating and evaluating at $\theta = 0$ recovers the first moment: since $M_{X_1}(0)=1$ and $M_{X_1}'(0)=\mathbb E[X_1]$, $$ M_{Y_t}'(0) = t\lambda\, M_{X_1}'(0)\, e^{t\lambda(M_{X_1}(0)-1)} = t\lambda\,\mathbb E[X_1], $$ in agreement with the computation above. For example, if $X_1\sim\mathsf{Ber}(p)$ then $$ M_{X_1}(\theta) = e^{\theta\cdot 0}(1-p) + e^{\theta\cdot 1}p = 1 - p + pe^{\theta}, $$ in which case \begin{align} M_{Y_t}(\theta) &= e^{t\lambda p\left(e^\theta-1\right)}, \end{align} which you may recognize as the mgf of a $\mathsf{Poisson}(t\lambda p)$ random variable (Poisson thinning).
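If you want a sanity check on these formulas, here is a short Monte Carlo sketch of the Bernoulli-jump case (the parameter values $t$, $\lambda$, $p$, $\theta$ below are arbitrary choices for illustration): it simulates many copies of $Y_t$ and compares the sample mean against $t\lambda p$ and the empirical mgf at one point against $e^{t\lambda p(e^\theta-1)}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the question)
t, lam, p = 2.0, 3.0, 0.4
n_sims = 200_000

# Simulate Y_t = sum_{i=1}^{N_t} X_i with N_t ~ Poisson(t*lam), X_i ~ Bernoulli(p).
# A sum of N i.i.d. Bernoulli(p) variables is Binomial(N, p), so we can draw Y directly.
N = rng.poisson(t * lam, size=n_sims)
Y = rng.binomial(N, p)

theta = 0.3
mean_theory = t * lam * p                               # E[Y_t] = t*lambda*E[X_1]
mgf_theory = np.exp(t * lam * p * (np.exp(theta) - 1))  # M_{Y_t}(theta)

print("mean:", Y.mean(), "vs theory:", mean_theory)
print("mgf :", np.exp(theta * Y).mean(), "vs theory:", mgf_theory)
```

Both empirical quantities should land within Monte Carlo error of the closed-form values; increasing `n_sims` tightens the agreement.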