Variance of Negative Binomial Distribution (without Moment Generating Functions)


Given the probability mass function of the negative binomial distribution in the form

$$P(X = n) = {n-1\choose r-1} (1-p)^{n-r}p^r, \qquad n \geq r,$$

there appear to be no derivations anywhere on the web of the variance formula $V(X) = \frac{r(1-p)}{p^2}$ that do not make use of the moment generating function.

I have managed to compute the mean without it, as follows:

\begin{align*} \mu = \sum_{n\geq r} n{n-1\choose r-1} (1-p)^{n-r}p^r &= \sum_{n\geq r} \frac{n(n-1)!}{(r-1)!(n-r)!}(1-p)^{n-r}p^r \\ &= \frac{r}{p} \sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} \end{align*} Having factored out the claimed mean $r/p$, it remains to show that $\sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} = 1$, which is done by reindexing (both $r$ and $n$) and recognizing the result as the total mass of a negative binomial distribution. Indeed, letting $k = r+1$ followed by $m = n+1$, we find

\begin{align*} \sum_{n\geq r} \frac{n!}{r!(n-r)!}(1-p)^{n-r} p^{r+1} &= \sum_{n\geq k-1}\frac{n!}{(k-1)!(n-k+1)!}(1-p)^{n-k+1}p^k\\ &= \sum_{m\geq k}\frac{(m-1)!}{(k-1)!(m-k)!}(1-p)^{m-k}p^k\\ &= \sum_{m\geq k}{m-1\choose k-1}(1-p)^{m-k}p^k = 1 \end{align*}
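As a numerical sanity check of the two claims above (a short Python sketch; the truncation bound `N` and the parameter values are arbitrary choices, since the infinite series must be cut off somewhere):

```python
from math import comb

# PMF of the negative binomial counting total trials n until the
# r-th success: P(X = n) = C(n-1, r-1) (1-p)^(n-r) p^r for n >= r.
def nb_pmf(n, r, p):
    return comb(n - 1, r - 1) * (1 - p) ** (n - r) * p ** r

r, p, N = 4, 0.35, 2000  # N truncates the infinite series

# The reindexed sum: n!/(r!(n-r)!) (1-p)^(n-r) p^(r+1) over n >= r,
# which the argument above shows equals 1.
reindexed = sum(comb(n, r) * (1 - p) ** (n - r) * p ** (r + 1)
                for n in range(r, N))

# The mean computed directly from the PMF, which should equal r/p.
mean = sum(n * nb_pmf(n, r, p) for n in range(r, N))

print(reindexed)   # ~ 1
print(mean, r / p) # both ~ r/p
```

The truncation error is negligible here because the tail terms decay geometrically in $(1-p)$.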

Does anyone know of a way to demonstrate that $\sigma^2 = V(X) = \frac{r(1-p)}{p^2}$ in this fashion?

There are 2 answers below.


This is too long for a comment, so I have it here as an answer.

Funny you should ask this, since I was trying to figure it out yesterday. To prove that the negative binomial PMF really does sum to $1$ over $\mathbb{Z}_{\geq 0}$, you will need the binomial theorem for negative exponents (as Alex has indicated) and the fact posted at Negative binomial coefficient (note, though, that the identity there is written for the "other" negative binomial distribution, with $K = X-r$).
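Both ingredients can be checked numerically. The sketch below (assuming the standard identity $\binom{-r-1}{t} = (-1)^t\binom{r+t}{t}$, which is the "negative binomial coefficient" fact referred to above) verifies the identity exactly with rationals, and checks that the $K = X - r$ form of the PMF sums to $1$:

```python
from fractions import Fraction
from math import comb, prod

def gen_binom(a, t):
    """Generalized binomial coefficient C(a, t), valid for negative a."""
    return prod(Fraction(a - i, i + 1) for i in range(t))  # empty product = 1

r = 3

# The "negative binomial coefficient" identity: C(-r-1, t) = (-1)^t C(r+t, t)
for t in range(8):
    assert gen_binom(-r - 1, t) == (-1) ** t * comb(r + t, t)

# In the K = X - r parameterization, P(K = t) = C(r+t-1, t) p^r q^t;
# by the negative-exponent binomial theorem these sum to p^r (1-q)^(-r) = 1.
p, q = 0.4, 0.6
total = sum(comb(r + t - 1, t) * p ** r * q ** t for t in range(500))
print(total)  # ~ 1
```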

The second moment $\mathbb{E}[X^2]$ is a bit tedious to compute: you will need to reindex twice rather than once, as with $\mathbb{E}[X]$. Unfortunately, the form of your negative binomial PMF is different from the one I worked with ($K = X-r$, as indicated above), so I don't have a sketch of this.


Writing $q = 1-p$:

\begin{align*}
E(X^2) &= \sum_{n=r}^\infty n^2\binom{n-1}{r-1}p^r q^{n-r}\\
&= \sum_{t=0}^\infty (r+t)^2\binom{r+t-1}{r-1}p^r q^{t} \qquad (\text{let } n-r=t)\\
&= p^r\sum_{t=0}^\infty (r+t)\frac{(r+t)!}{(r-1)!\,t!}\,q^{t}\\
&= p^r\sum_{t=0}^\infty r\,\frac{(r+t)!}{(r-1)!\,t!}\,q^{t}+p^r\sum_{t=0}^\infty t\,\frac{(r+t)!}{(r-1)!\,t!}\,q^{t}\\
&= r^2p^r\sum_{t=0}^\infty \frac{(r+t)!}{r!\,t!}\,q^{t}+rp^r\sum_{t=0}^\infty \frac{(r+t)!}{r!\,t!}\,t\,q^{t}\\
&= r^2p^r p^{-r-1} + rp^r\sum_{t=0}^\infty \binom{-r-1}{t}\,t\,(-q)^{t}\\
&= \frac{r^2}{p}+rp^r q\sum_{t=0}^\infty \binom{-r-1}{t}(-t)(-q)^{t-1}\\
&= \frac{r^2}{p}+rp^r q\sum_{t=0}^\infty \binom{-r-1}{t}\frac{d(-q)^t}{dq}\\
&= \frac{r^2}{p}+rp^r q\,\frac{d}{dq}\sum_{t=0}^\infty \binom{-r-1}{t}(-q)^t\\
&= \frac{r^2}{p}+rp^r q\,\frac{d(1-q)^{-r-1}}{dq}\\
&= \frac{r^2}{p}+\frac{(r+1)rq}{p^2}.
\end{align*}

Therefore
\begin{align*}
Var(X) &= E(X^2)-[E(X)]^2\\
&= \frac{r^2}{p}+\frac{(r+1)rq}{p^2}-\frac{r^2}{p^2}\\
&= \frac{r(1-p)}{p^2}.
\end{align*}
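As a quick numerical check of this answer's closed forms (a Python sketch with arbitrary parameter values and an arbitrary truncation bound `N`), one can compare the truncated series for $E(X^2)$ against $\frac{r^2}{p}+\frac{(r+1)rq}{p^2}$, and the resulting variance against $\frac{rq}{p^2}$:

```python
from math import comb

r, p = 4, 0.25
q = 1 - p
N = 2000  # truncation of the infinite series

# Second moment directly from the PMF P(X = n) = C(n-1, r-1) p^r q^(n-r)
second = sum(n * n * comb(n - 1, r - 1) * p ** r * q ** (n - r)
             for n in range(r, N))

closed_second = r ** 2 / p + (r + 1) * r * q / p ** 2  # E(X^2) closed form
variance = second - (r / p) ** 2                       # E(X^2) - [E(X)]^2

print(second, closed_second)     # both ~ 304
print(variance, r * q / p ** 2)  # both ~ 48
```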