I want to understand a particular argument, set as an exercise in the book mentioned above.
So we have a sequence $X_1, X_2, \dots$ of independent Bernoulli random variables, each with the same success probability $p$ and failure probability $q = 1 - p$, and we set $S_n = \sum_{i=1}^n X_i$.
The task is to compute $\mathbb{E} z^{S_n}$ and use that to determine $\mathbb{P}\{S_n=k\}$.
Now there is a solution to this, which shows that (for any $z \in \mathbb{R}$), by independence,
$\mathbb{E} z^{S_n} = \mathbb{E}z^{X_1} \cdots \mathbb{E}z^{X_n} = (q + pz)^n = \sum_{k=0}^n {n \choose k} p^k q^{n-k} z^k,$
using $\mathbb{E}z^{X_i} = q \cdot z^0 + p \cdot z^1 = q + pz$ and the binomial theorem.
The argument that follows is that this implies that the distribution of $S_n$ is the discrete measure placing mass ${n \choose k}p^kq^{n-k}$ at $k$ for each $k = 0, \dots, n$, i.e. $S_n \sim \mathrm{Binomial}(n, p)$.
How is this implied?
We have $$\sum_{k=0}^n {n \choose k} p^k q^{n-k} z^k =\mathbb E(z^{S_n})=\sum_{k=0}^n \mathbb E\left(z^{S_n}{\bf 1}_{\{S_n=k\}}\right) =\sum_{k=0}^n \mathbb P(S_n =k)\, z^k \,,$$ where the last equality holds because $z^{S_n}{\bf 1}_{\{S_n=k\}} = z^k{\bf 1}_{\{S_n=k\}}$ and $\mathbb E\,{\bf 1}_{\{S_n=k\}} = \mathbb P(S_n=k)$.
Now if two polynomials take the same value for all $z$, then their coefficients must agree as well: the difference of the two polynomials vanishes identically, and a polynomial that is identically zero has all coefficients equal to zero. Matching the coefficients of $z^k$ on both sides gives $\mathbb P(S_n = k) = {n \choose k} p^k q^{n-k}$ for $k = 0, \dots, n$.
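Not part of the proof, but as a quick numerical sanity check of the coefficient matching: the sketch below expands $(q+pz)^n$ by repeated polynomial multiplication and compares the coefficient of $z^k$ with the binomial probability ${n \choose k}p^k q^{n-k}$. The particular values of $p$ and $n$ are arbitrary choices for illustration.

```python
from math import comb

p, n = 0.3, 5       # arbitrary example values
q = 1 - p

# coeffs[k] = coefficient of z^k; start with the constant polynomial 1
coeffs = [1.0]
for _ in range(n):
    # multiply the current polynomial by (q + p*z)
    new = [0.0] * (len(coeffs) + 1)
    for k, c in enumerate(coeffs):
        new[k] += q * c      # contribution of the constant term q
        new[k + 1] += p * c  # contribution of the term p*z
    coeffs = new

# each coefficient equals the Binomial(n, p) pmf at k
for k in range(n + 1):
    expected = comb(n, k) * p**k * q**(n - k)
    assert abs(coeffs[k] - expected) < 1e-12
print("coefficients of (q + p z)^n match the Binomial(n, p) pmf")
```

The inner loop is just the distributive law, so this mirrors the algebra in the binomial expansion step above.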