Let's say $X_i$ (for $i = 1, 2, \ldots$) are independent r.v.'s taking the values $0$ and $1$, each with probability $0.5$.
Let's say $N$ is a geometric random variable, independent of the $X_i$, with $P(N=n) = 0.5^n$ for $n=1, 2, \ldots$
Let's define $Y$ to be equal to $\sum_{i=1}^N X_i$.
A solution to an exam problem (that I couldn't fully comprehend) first states the probability generating functions of $X_i$ and $N$ are, respectively:
$$G_{X}(t) = \frac{1}{2}+\frac{t}{2}, \qquad G_N(t) = \frac{t}{2-t}$$
That I understand. But in the next sentence they swiftly deduce that:
$$G_Y(t) = \frac{1+t}{3-t}$$
How do you do this? I still often have trouble dealing with random sums like $Y$ here.
I'm looking for a quick, clean proof of this; I suppose one exists, since they chose to omit it.
Thank you!
$$G_Y(t) = E(t^Y) = E(t^{\sum_{i=1}^NX_i}) = E(t^{X_1} \cdots t^{X_N}) $$
Now condition on $N$ and use the independence of the $X_i$ (from each other and from $N$) to find that
$$ G_Y(t)= E\left(E\left(t^{X_1} \cdots t^{X_N} \mid N\right)\right) = E\left(\left(\frac{1+t}2\right)^N\right) = G_N\!\left(\frac{1+t}2\right) = \frac{\frac{1+t}2}{2 - \frac{1+t}2} = \frac{1+t}{3-t}$$
In other words, $G_Y = G_N \circ G_X$: conditioned on $N = n$, the product $t^{X_1} \cdots t^{X_n}$ factors into $n$ independent copies of $E(t^{X_1}) = G_X(t)$.
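As a sanity check (not part of the original solution), here is a small Monte Carlo sketch: it simulates $Y$ by drawing $N$ as the number of fair coin flips up to and including the first head, summing $N$ fair Bernoulli variables, and comparing the empirical value of $E(t^Y)$ against $\frac{1+t}{3-t}$. The sample size and seed are arbitrary choices.

```python
import random

def sample_Y(rng):
    # N ~ Geometric(1/2) on {1, 2, ...}: flip a fair coin until the first head
    n = 1
    while rng.random() < 0.5:
        n += 1
    # Y = sum of N independent fair Bernoulli(1/2) variables
    return sum(rng.random() < 0.5 for _ in range(n))

def empirical_pgf(t, samples):
    # Estimate E(t^Y) by averaging t^y over the simulated values
    return sum(t ** y for y in samples) / len(samples)

rng = random.Random(0)
samples = [sample_Y(rng) for _ in range(200_000)]
for t in (0.0, 0.5, 0.9):
    print(t, empirical_pgf(t, samples), (1 + t) / (3 - t))
```

At $t = 0$ this also checks $P(Y=0) = G_Y(0) = 1/3$, since $t^Y$ is $1$ exactly when $Y = 0$.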