I am stuck on the following problem. I believe that my solution is right so far,
but I do not know how to finish it. Ideally, I would like to do this problem
without using moment generating functions or convolutions. Maybe that is not a realistic goal.
Thanks,
Bob
Problem:
Let $X$ and $Y$ be independent binomial r.v.'s with parameters $(n,p)$ and $(m,p)$,
respectively. Let $Z = X + Y$. What is the distribution of $Z$?
Answer:
\begin{eqnarray*}
P(Z = k) &=& \sum_{i = 0}^{k} P(X = i)\,P(Y = k-i) \\
&=& \sum_{i = 0}^{k} {n \choose i} p^i (1-p)^{n-i}
{m \choose {k-i}} p^{k-i} (1-p)^{m-(k-i)} \\
&=& \sum_{i = 0}^{k} {n \choose i} {m \choose {k-i}} p^k (1-p)^{n-i} (1-p)^{m-k+i} \\
&=& \sum_{i = 0}^{k} {n \choose i} {m \choose {k-i}} p^k (1-p)^{n+m-k}
\end{eqnarray*}
On "The Sum of Two Binomial Variables", asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail), 2.1k views.

There are 2 solutions below.
You can go on with:
$$=p^k(1-p)^{n+m-k}\sum_{i=0}^k\binom{n}{i}\binom{m}{k-i}=p^k(1-p)^{n+m-k}\binom{n+m}{k},$$
so $Z \sim \operatorname{Bin}(n+m, p)$.
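As a quick numerical sanity check of this conclusion (not part of the original answer; the values of $n$, $m$, $p$ below are arbitrary), the convolution sum from the question can be compared against the $\operatorname{Bin}(n+m,p)$ pmf:

```python
from math import comb

def pmf_convolution(k, n, m, p):
    # The convolution sum from the derivation above.
    # math.comb returns 0 when the lower index exceeds the upper,
    # so out-of-range terms vanish automatically.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               * comb(m, k - i) * p**(k - i) * (1 - p)**(m - (k - i))
               for i in range(k + 1))

def pmf_binomial(k, N, p):
    # Binomial(N, p) pmf at k.
    return comb(N, k) * p**k * (1 - p)**(N - k)

n, m, p = 5, 7, 0.3
assert all(abs(pmf_convolution(k, n, m, p) - pmf_binomial(k, n + m, p)) < 1e-9
           for k in range(n + m + 1))
```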
Btw, you can also deduce more directly that the sum of two independent binomials with equal parameter $p$ is binomial again.
This is based on the fact that a binomial is actually a sum of iid Bernoulli-distributed random variables.
See here for that.
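Spelling that argument out (a standard decomposition, not stated explicitly in the thread): write
$$X = \sum_{i=1}^{n} U_i, \qquad Y = \sum_{j=1}^{m} V_j,$$
where $U_1,\dots,U_n,V_1,\dots,V_m$ are iid $\operatorname{Bernoulli}(p)$ and the two groups are independent (this uses the independence of $X$ and $Y$). Then
$$Z = X + Y = \sum_{i=1}^{n} U_i + \sum_{j=1}^{m} V_j$$
is a sum of $n+m$ iid $\operatorname{Bernoulli}(p)$ variables, hence $Z \sim \operatorname{Bin}(n+m, p)$.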
$\sum_{i=0}^{k}\binom n i \binom m {k-i} = \binom {n+m} k$ is a standard identity (Vandermonde's identity); use it to finish the proof.
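The identity can be spot-checked numerically (the choice of $n=6$, $m=4$ here is arbitrary):

```python
from math import comb

# Spot-check Vandermonde's identity,
#   sum_{i=0}^{k} C(n,i) C(m,k-i) = C(n+m,k),
# for every feasible k at one choice of n and m.
n, m = 6, 4
for k in range(n + m + 1):
    lhs = sum(comb(n, i) * comb(m, k - i) for i in range(k + 1))
    assert lhs == comb(n + m, k), (k, lhs)
```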