Let $X$ be a Bernoulli random variable with parameter $p\in (0,1)$. Then its moment generating function is $M_X(t)=1-p+pe^t$. Now suppose $(Y\mid Z=z)\sim \mathrm{Ber}(z)$, where $Z$ is some random variable with values in $(0,1)$. Then the moment generating function of $Y\mid Z=z$ is $1-z+ze^t$.
Does it make sense to say that the moment generating function of $Y\mid Z$ at $t$ is given by $1-Z+Ze^t$? Intuitively this is true, but I do not know how to make the statement rigorous. I hope someone understands what I am trying to do and can help me write this down in a mathematically correct way.
Let the probability space be $(\Omega,\mathscr{A},\mu)$. Since $Z$ takes values in $(0,1)$, it is bounded and hence $Z \in \mathcal{L}^1(\mathscr{A})$. We verify that $U_t(\omega)=1-Z(\omega)+Z(\omega)e^t$ is a version of the conditional expectation $E[e^{Yt}\mid\sigma(Z)](\omega)$. Clearly $U_t$ is $\sigma(Z)$-measurable and $U_t \in \mathcal{L}^1(\sigma(Z))$. For $C \in \sigma(Z)$ we have $$E[U_t\mathbb{I}_C]=E[(1-Z+Ze^t)\mathbb{I}_C]=P(C)-E[Z\mathbb{I}_C]+e^tE[Z\mathbb{I}_C],$$ while on the other hand $$\begin{aligned}E[e^{Yt}\mathbb{I}_C]&=e^tP(\{Y=1\}\cap C)+P(\{Y=0\}\cap C)\\ &=e^tP(\{Y=1\}\cap C)+P(C)-P(\{Y=1\}\cap C)\\ &=e^tE[\mathbb{I}_{\{Y=1\}}\mathbb{I}_C]+P(C)-E[\mathbb{I}_{\{Y=1\}}\mathbb{I}_C]. \end{aligned}$$ By assumption, $Z(\omega)=P(Y=1\mid\sigma(Z))(\omega)=E[\mathbb{I}_{\{Y=1\}}\mid\sigma(Z)](\omega)$, which by the defining property of conditional expectation implies $E[\mathbb{I}_{\{Y=1\}}\mathbb{I}_C]=E[Z\mathbb{I}_C]$ for all $C \in \sigma(Z)$. Substituting this into the second display shows $E[e^{Yt}\mathbb{I}_C]=E[U_t\mathbb{I}_C]$ for all $C\in\sigma(Z)$, and therefore $$U_t=E[e^{Yt}\mid\sigma(Z)],\quad\mu\textrm{-a.e.}$$
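As a numerical sanity check (not part of the proof), the identity can be tested through the tower property: $E[e^{tY}]=E\big[E[e^{tY}\mid\sigma(Z)]\big]=E[1-Z+Ze^t]$. Here is a minimal Monte Carlo sketch; the choice $Z\sim\mathrm{Uniform}(0,1)$ and the value of $t$ are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
t = 0.7  # arbitrary fixed argument of the MGF

# Z ~ Uniform(0,1): an arbitrary choice of mixing distribution on (0,1)
z = rng.uniform(size=n)
# Draw Y | Z = z ~ Bernoulli(z)
y = (rng.uniform(size=n) < z).astype(float)

# Tower property: E[e^{tY}] should match E[1 - Z + Z e^t]
lhs = np.exp(t * y).mean()
rhs = (1 - z + z * np.exp(t)).mean()
print(lhs, rhs)
```

With $Z$ uniform, $E[Z]=\tfrac12$, so both estimates should be close to $(1+e^t)/2$.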