I'm self-studying probability, specifically generating functions. I'm looking for the relation between conditional probabilities and probability generating functions in order to justify writing the probability generating function of the following problem.
Suppose I have a coin with probability $p$ of landing heads and $1-p$ of landing tails. Letting $X=1$ indicate that the coin landed heads and $X=0$ tails, the probability generating function is
$$G_\text{coin}(s) = (1-p) + ps $$
Now suppose I have two dice. Dice A is a fair three-sided die, and Dice B is a fair six-sided die. The PGFs are
$$G_\text{Dice A}(s) = \frac{1}{3}(s + s^2 + s^3)$$ $$G_\text{Dice B}(s) = \frac{1}{6}(s + s^2 + s^3 + s^4 + s^5 + s^6)$$
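As a quick sanity check, a PGF can be represented by its list of coefficients, and it must satisfy $G(1)=1$; its derivative at $1$ gives the mean. A minimal Python sketch (the names `pgf_eval`, `die_a`, `die_b` are my own):

```python
from fractions import Fraction

def pgf_eval(coeffs, s):
    """Evaluate a PGF from its coefficients: coeffs[k] = P(X = k)."""
    return sum(c * s**k for k, c in enumerate(coeffs))

# coeffs[k] = P(roll = k); index 0 is unused for these dice
die_a = [Fraction(0)] + [Fraction(1, 3)] * 3   # fair three-sided die
die_b = [Fraction(0)] + [Fraction(1, 6)] * 6   # fair six-sided die

# A valid PGF satisfies G(1) = 1 (the probabilities sum to one)
print(pgf_eval(die_a, 1))   # 1
print(pgf_eval(die_b, 1))   # 1

# The mean is G'(1) = sum over k of k * P(X = k)
print(sum(k * c for k, c in enumerate(die_a)))   # 2
print(sum(k * c for k, c in enumerate(die_b)))   # 7/2
```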
Now the problem is the following: I flip the coin; if it lands tails I roll Dice A, and if it lands heads I roll Dice B. What is the PGF for the number shown on the die in this process?
I expect it will be $$ G_\text{coin, then dice}(s) = (1-p) G_\text{Dice A}(s) + p G_\text{Dice B}(s)$$
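Since the coefficients of this mixture are exactly the pmf of the resulting roll, the conjectured formula can be checked concretely with exact fractions (a Python sketch; `mix_pmfs` is a hypothetical helper, and $p=1/2$ is just an illustrative choice):

```python
from fractions import Fraction

def mix_pmfs(w_a, pmf_a, w_b, pmf_b):
    """Mixture of two pmfs given as coefficient lists (coeffs[k] = P(k))."""
    n = max(len(pmf_a), len(pmf_b))
    pad = lambda v: v + [Fraction(0)] * (n - len(v))
    return [w_a * a + w_b * b for a, b in zip(pad(pmf_a), pad(pmf_b))]

p = Fraction(1, 2)                              # illustrative choice of p
pmf_a = [Fraction(0)] + [Fraction(1, 3)] * 3    # Dice A
pmf_b = [Fraction(0)] + [Fraction(1, 6)] * 6    # Dice B
pmf_y = mix_pmfs(1 - p, pmf_a, p, pmf_b)

# For p = 1/2: P(Y=1..3) = 1/4 each, P(Y=4..6) = 1/12 each
print(pmf_y[1:])
```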
But I would like to understand the general property that makes this true.
Let $Y$ be the result of the die roll. Since tails ($X=0$) selects Dice A and heads ($X=1$) selects Dice B, we can write $Y=(1-X)Y_A + XY_B$, where $Y_A$ is uniformly distributed over $\{1,2,3\}$, $Y_B$ is uniformly distributed over $\{1,2,3,4,5,6\}$, and $X$, $Y_A$, $Y_B$ are mutually independent. Note that $s^{(1-X)Y_A}$ and $s^{XY_B}$ both depend on $X$, so their expectations do not factor; the general property at work is instead the law of total expectation, conditioning on $X$: \begin{align} G_Y(s) &= \mathbb E\left[s^Y\right]\\ &=\mathbb E\left[s^Y\mid X=0\right]\mathbb P(X=0)+\mathbb E\left[s^Y\mid X=1\right]\mathbb P(X=1)\\ &=\mathbb E\left[s^{Y_A}\mid X=0\right](1-p)+\mathbb E\left[s^{Y_B}\mid X=1\right]p\\ &=(1-p)\,\mathbb E\left[s^{Y_A}\right]+p\,\mathbb E\left[s^{Y_B}\right]\\ &=(1-p)\,G_{Y_A}(s)+p\,G_{Y_B}(s), \end{align} as was surmised. The third equality uses $Y=Y_A$ on the event $\{X=0\}$ and $Y=Y_B$ on $\{X=1\}$; the fourth uses the independence of $Y_A$ and $Y_B$ from $X$. More generally, for any mixture selected by a discrete $X$, $$G_Y(s)=\sum_x \mathbb P(X=x)\,G_{Y\mid X=x}(s).$$
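The mixture formula can also be checked by simulation (a Monte Carlo sketch in Python; the function names and the particular values $p=0.3$, $s=0.5$ are my own choices):

```python
import random

def simulate_pgf(s, p, trials=200_000, seed=0):
    """Monte Carlo estimate of E[s^Y] for the coin-then-die process."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        if rng.random() < p:          # heads (prob p) -> roll Dice B
            y = rng.randint(1, 6)
        else:                         # tails -> roll Dice A
            y = rng.randint(1, 3)
        total += s**y
    return total / trials

def mixture_pgf(s, p):
    """(1-p) * G_A(s) + p * G_B(s)."""
    g_a = sum(s**k for k in range(1, 4)) / 3
    g_b = sum(s**k for k in range(1, 7)) / 6
    return (1 - p) * g_a + p * g_b

p, s = 0.3, 0.5
print(simulate_pgf(s, p))   # close to the value below
print(mixture_pgf(s, p))
```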