I had to prove this theorem.
If $X \sim \operatorname{Bin}(n, p)$ and $Y \sim \operatorname{Bin}(m, p)$ are independent random variables, then $$Z = X + Y \sim \operatorname{Bin}(n + m, p).$$
My proof: If we consider $Z = X + Y$, then $E(Z) = E(X + Y) \implies E(Z) = E(X) + E(Y)$ since $X$ and $Y$ are independent. We know that for a binomial distribution, the mean is the product of the two parameters. $$\therefore E(X) = n\cdot p$$ $$\text{and } E(Y) = m\cdot p.$$ Then we have $$E(Z) = n\cdot p + m\cdot p$$ $$\implies E(Z) = (n + m)\cdot p.$$ This corresponds to a binomial distribution where the number of trials is $(n + m)$ and the probability of success is $p$. From this, we can conclude $Z \sim \operatorname{Bin}(n + m, p)$.
Is this proof correct? Are there alternate ways to prove this?
You may also apply the Law of Total Probability and the independence of $X$ and $Y$: \begin{align*} \mathbb{P}(Z = z) & = \mathbb{P}(X + Y = z)\\ & = \sum_{x = 0}^{\infty}\mathbb{P}(X = x, Y = z - x)\\ & = \sum_{x = 0}^{\infty}\mathbb{P}(X = x)\mathbb{P}(Y = z - x)\\ & = \sum_{x = 0}^{z}{n\choose x}p^{x}(1 - p)^{n - x}{m\choose z - x}p^{z - x}(1 - p)^{m - z + x}\\ & = \sum_{x = 0}^{z}{n\choose x}{m\choose z - x}p^{z}(1 - p)^{m + n - z}\\ & = {n + m\choose z}p^{z}(1 - p)^{m + n - z} \end{align*} for $z = 0, 1, \dots, n + m$. The sum truncates at $x = z$ because $\mathbb{P}(Y = z - x) = 0$ for $x > z$, and the last equality is Vandermonde's identity, $\sum_{x = 0}^{z}{n\choose x}{m\choose z - x} = {n + m\choose z}$. This is the pmf of $\operatorname{Bin}(n + m, p)$, and we are done.
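If you want a quick sanity check of this convolution, here is a small Python sketch (the parameter values $n = 5$, $m = 7$, $p = 0.3$ are arbitrary choices for illustration) that compares the convolution sum above against the $\operatorname{Bin}(n + m, p)$ pmf term by term:

```python
from math import comb

def binom_pmf(k, n, p):
    # Binomial(n, p) probability mass at k; comb(n, k) is 0 for k outside [0, n]
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 5, 7, 0.3  # arbitrary example parameters

for z in range(n + m + 1):
    # Convolution of the two pmfs, mirroring the sum in the derivation
    conv = sum(binom_pmf(x, n, p) * binom_pmf(z - x, m, p)
               for x in range(0, z + 1))
    # Should match the Bin(n + m, p) pmf up to floating-point error
    assert abs(conv - binom_pmf(z, n + m, p)) < 1e-12

print("convolution matches Bin(n + m, p) at every point")
```

Of course this only verifies specific parameter values numerically; the algebraic argument above is the actual proof.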
Hopefully this helps!