Sum of Negative Binomial distributed r.v.


What is the distribution of the sum of two independent negative binomial distributed r.v.'s $R_1$ and $R_2$, i.e. $$ P(R_i=k) = \binom{\alpha_i + k-1}{k} p^{\alpha_i} (1-p)^k \text{ for } \alpha_i > 0? $$

Is it best to show this via the convolution of the two r.v.'s?


BEST ANSWER

If $R_1 \sim NB(a_1,p)$ is the number of successes before the $a_1$th failure and, independently, $R_2 \sim NB(a_2,p)$ with the same $p$, then running the two experiments back to back shows that $$R_1+R_2 \sim NB(a_1+a_2,p)$$ is the number of successes before the $(a_1+a_2)$th failure.

Alternatively, for integer $a_1$ and $a_2$, view $R_1$ as the sum of $a_1$ independent geometric random variables with parameter $p$, and $R_2$ as the sum of $a_2$ such variables; then $R_1+R_2$ is the sum of $a_1+a_2$ independent geometric random variables with parameter $p$.
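As a quick numerical sanity check of the additivity claim (a sketch of my own, not part of the answer; the simulation scheme and sample sizes are arbitrary choices), one can draw $R_1$ and $R_2$ as success counts before the $a_i$th failure and compare the sample mean of $R_1+R_2$ with the mean $a(1-p)/p$ of $NB(a_1+a_2,p)$:

```python
import random

def nb_sample(a, p, rng):
    """Draw one NegBinomial(a, p) variate: the number of successes
    (each occurring with probability 1 - p) before the a-th failure."""
    successes = 0
    failures = 0
    while failures < a:
        if rng.random() < p:   # a failure occurs with probability p
            failures += 1
        else:
            successes += 1
    return successes

rng = random.Random(42)
a1, a2, p, n = 3, 5, 0.4, 200_000

# Sample R1 + R2 and compare its mean with the NB(a1 + a2, p) mean.
sums = [nb_sample(a1, p, rng) + nb_sample(a2, p, rng) for _ in range(n)]
mean = sum(sums) / n
a = a1 + a2
print(mean, a * (1 - p) / p)  # sample mean vs. theoretical a(1-p)/p
```

For these parameters the theoretical mean is $8 \cdot 0.6/0.4 = 12$, and the sample mean should land close to it; the full distribution can be checked the same way against the NB pmf.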


A computational approach is to recall that the MGF of a sum of independent random variables is the product of the individual MGFs. In this case, if $$X_i \sim \operatorname{NegBinomial}(r_i, p), \quad i = 1, 2, \ldots, n,$$ are independent negative binomial random variables with common Bernoulli probability parameter $p$, for the parametrization $$\Pr[X_i = x] = \binom{r_i + x - 1}{r_i - 1} p^{r_i} (1-p)^x, \quad x \in \{0, 1, 2, \ldots\},$$ we have $$M_{X_i}(t) = \operatorname{E}[e^{tX_i}] = \left(\frac{p}{1-e^t(1-p)}\right)^{r_i}, \quad t < -\ln(1-p).$$

It immediately follows that for $S_n = \sum_{i=1}^n X_i$ we have $$M_{S_n}(t) = \prod_{i=1}^n M_{X_i}(t) = \prod_{i=1}^n \left(\frac{p}{1-e^t(1-p)}\right)^{r_i} = \left(\frac{p}{1-e^t(1-p)}\right)^{\sum_{i=1}^n r_i},$$ which shows that $$S_n \sim \operatorname{NegBinomial}(r_1 + \cdots + r_n, p).$$ This is of course the same idea as showing that the sum of IID geometric random variables is negative binomial.
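For integer $r_i$, the same conclusion can also be verified exactly by computing the convolution numerically. The sketch below (my own; the parameter values are arbitrary) uses the pmf parametrization above and checks term by term that the convolution of two negative binomial pmfs equals the $\operatorname{NegBinomial}(r_1+r_2, p)$ pmf:

```python
from math import comb

def nb_pmf(r, p, x):
    """P[X = x] for X ~ NegBinomial(r, p) in the parametrization
    C(r + x - 1, x) p^r (1 - p)^x (x successes before the r-th failure)."""
    return comb(r + x - 1, x) * p**r * (1 - p)**x

r1, r2, p = 3, 4, 0.35

# Convolve the two pmfs and compare with NB(r1 + r2, p) term by term.
for s in range(30):
    conv = sum(nb_pmf(r1, p, k) * nb_pmf(r2, p, s - k) for k in range(s + 1))
    direct = nb_pmf(r1 + r2, p, s)
    assert abs(conv - direct) < 1e-12
print("convolution matches NB(r1 + r2, p)")
```

The term-by-term agreement is exactly the Vandermonde identity $\sum_k \binom{r_1+k-1}{k}\binom{r_2+s-k-1}{s-k} = \binom{r_1+r_2+s-1}{s}$, which is what a direct convolution proof would establish.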