Mean and variance formulas for the negative binomial distribution


The equation below gives the expected value of the negative binomial distribution. I need a derivation of this formula. I have searched a lot but can't find a solution anywhere. Thanks for helping :)

$$ E(X)=\sum_{x=r}^\infty x\cdot \binom {x-1}{r-1} \cdot p^r \cdot (1-p)^{x-r} =\frac{r}{p} $$

I have tried: \begin{align} E(X) & =\sum_{x=r}^\infty x\cdot \binom{x-1}{r-1} \cdot p^r \cdot (1-p)^{x-r} \\[8pt] & = \sum_{x=r}^\infty x \cdot \frac{(x-1)!}{(r-1)! \cdot (x-1-(r-1))!} \cdot p^r \cdot (1-p)^{x-r} \\[8pt] & = \sum_{x=r}^\infty \frac{x!}{(r-1)!\cdot (x-r)!} \cdot p^r \cdot (1-p)^{x-r} \\[8pt] & = \sum_{x=r}^\infty r\cdot \frac{x!}{r!\cdot (x-r)!}\cdot p^r \cdot (1-p)^{x-r} \\[8pt] & = \frac{r}{p} \cdot \sum_{x=r}^\infty \frac{x!}{r!\cdot (x-r)!}\cdot p^{r+1}\cdot (1-p)^{x-r} \end{align}

If the power of $p$ in the last sum were $r$ rather than $r+1$, I could apply the binomial theorem and the identity would follow. But I am stuck here.

BEST ANSWER

If you want to continue that derivation instead of using linearity of expectation on a sum of i.i.d. geometric random variables, then you can follow this; however, doing it this way is much more complicated than the method using the i.i.d. variables.

When you arrive at the step $\operatorname{E}(X) = \sum_{x\geq r} r \binom{x}{r} p^r (1 - p)^{x - r}$, you can use the following fact about power series:

$$ \frac{1}{(1 - z)^{r + 1}} = \sum_{n\geq r} \binom{n}{r}z^{n-r}, \quad \text{for }\lvert z\rvert < 1. $$

If this fact is unfamiliar to you, then you can derive it from the geometric series $\frac{1}{1 - z} = \sum_{n\geq 0} z^n$ by differentiating both sides $r$ times and dividing by $r!$. Of course, we are tacitly assuming that $p \neq 0$ in order to use this. Otherwise, the event that we want to occur $r$ times could not occur at all!
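If you want to convince yourself of the identity numerically before using it, here is a small sketch (the values of $r$ and $z$ are arbitrary choices for illustration) that compares a truncated partial sum against the closed form:

```python
import math

# Check 1/(1-z)^(r+1) == sum_{n>=r} C(n,r) z^(n-r) numerically
# by truncating the series; the tail decays geometrically for |z| < 1.
r, z = 3, 0.4
partial = sum(math.comb(n, r) * z**(n - r) for n in range(r, 200))
closed = 1 / (1 - z)**(r + 1)
print(partial, closed)  # the two values agree to many decimal places
```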

It follows that

$$\begin{align*} \operatorname{E}(X) &= r p^r\sum_{x\geq r} \binom{x}{r} (1 - p)^{x - r} \\ &= rp^r \cdot \frac{1}{\big(1 - (1 - p)\big)^{r + 1}} \\ &= rp^r \cdot \frac{1}{ p^{r + 1}} \\ &= \frac{r}{p} \end{align*}$$
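As a sanity check on the result $\operatorname{E}(X) = r/p$, here is a minimal Monte Carlo sketch (the parameter values and sample size are arbitrary) that simulates the number of Bernoulli trials needed to reach $r$ successes:

```python
import random

def neg_binom_trials(r, p, rng):
    """Number of Bernoulli(p) trials needed to see r successes."""
    trials = successes = 0
    while successes < r:
        trials += 1
        if rng.random() < p:
            successes += 1
    return trials

rng = random.Random(0)
r, p, n = 4, 0.3, 200_000
mean = sum(neg_binom_trials(r, p, rng) for _ in range(n)) / n
print(mean, r / p)  # sample mean should be close to r/p ≈ 13.33
```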

We can do something similar for the variance using the formula

$$\begin{align*} \operatorname{Var} X &= \operatorname{E}\big(X^2\big) - \big(\operatorname{E}(X)\big)^2 \\ &= \operatorname{E}\big(X(X + 1)\big) - \operatorname{E}(X) - \big(\operatorname{E}(X)\big)^2. \end{align*}$$

This means that

$$\begin{align*} \operatorname{Var} X &= \sum_{x\geq r} x (x + 1)\binom{x - 1}{r - 1} p^r (1 - p)^{x - r} - \frac{r}{p} - \frac{r^2}{p^2} \\ &= \sum_{x\geq r} r (r + 1)\binom{x + 1}{r + 1} p^r (1 - p)^{x - r} - \frac{r p + r^2}{p^2} \\ &= r(r + 1)p^r \sum_{x\geq r+1} \binom{x}{r + 1} (1 - p)^{x - (r + 1)} -\frac{r p + r^2}{p^2} \\ &= r(r + 1)p^r \cdot \frac{1}{\big(1 - (1 - p)\big)^{r + 2}} -\frac{r p + r^2}{p^2} \\ &= \frac{r^2 + r}{p^2} - \frac{rp + r^2}{p^2} \\ &= \frac{r (1 - p)}{p^2}. \end{align*}$$
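The variance formula can be checked the same way, here by summing the pmf directly (truncated far into the tail, with arbitrary illustrative parameters) rather than by simulation:

```python
import math

# Truncated exact computation of Var(X) from the pmf
# P(X = x) = C(x-1, r-1) p^r (1-p)^(x-r), compared to r(1-p)/p^2.
r, p = 4, 0.3
mean = r / p
var = sum((x - mean)**2 * math.comb(x - 1, r - 1) * p**r * (1 - p)**(x - r)
          for x in range(r, 500))
print(var, r * (1 - p) / p**2)  # both ≈ 31.11
```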


Here is an alternative derivation of the expectation of a negative binomial distribution. Let $X_r$ be the number of trials needed to get $r$ successes, and let $p$ be the probability of success on any given trial.

If $r=1$, then $X_r$ has a geometric distribution, and so $\mathrm{E}(X_r)=1/p$. To prove this, let $S$ be the event that the first trial is a success. By the law of iterated expectation, \begin{align} \DeclareMathOperator{\E}{\mathrm{E}} \DeclareMathOperator{\P}{\mathrm{P}} \E(X_1) &= \E(X_1 \mid S)\P(S)+\E(X_1\mid S')\P(S') \\[4pt] &= (1 \cdot p)+(\E(X_1)+1)(1-p) \, , \end{align} and rearranging yields the desired result.

Computing $\E(X_r)$ in general is not difficult if we use linearity of expectation. Let $A_i$ be the number of additional trials needed to get the $i$-th success once you have had $i-1$ successes. Note that $X_r=\sum_{i=1}^{r}A_i$, and so $$ \E(X_r)=\E\left(\sum_{i=1}^r A_i\right)=\sum_{i=1}^r \E(A_i) \, . $$ But the expected number of trials needed to get the $i$-th success is no different from the expected number of trials needed to get the first success, and so $\E(A_i)=\E(A_1)=\E(X_1)=1/p$. Hence, $\E(X_r)=r/p$.
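The decomposition into i.i.d. geometric waiting times can be sketched in a short simulation (parameter values are arbitrary illustrations): draw $r$ independent Geometric($p$) variables, sum them, and compare the sample mean to $r/p$.

```python
import random

def geometric(p, rng):
    """Trials until the first success (support 1, 2, 3, ...)."""
    k = 1
    while rng.random() >= p:
        k += 1
    return k

rng = random.Random(1)
r, p, n = 5, 0.25, 100_000
# X_r as a sum of r i.i.d. geometric waiting times A_1, ..., A_r
mean = sum(sum(geometric(p, rng) for _ in range(r)) for _ in range(n)) / n
print(mean, r / p)  # sample mean should be close to r/p = 20
```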