Computing the expectation and using the differentiation trick


I am reading Casella & Berger's Statistical Inference and trying to solve some of the exercises. Looking at the solution to exercise 2.20, the author used a method I don't understand, introducing a derivative inside the infinite sum. Can you please explain the steps that lead to $\frac1p$?


$$\mathbb{E}X = \sum_{k=1}^\infty k(1-p)^{k-1}p = -p\sum_{k=1}^\infty\frac{d}{dp}(1-p)^k = -p\frac{d}{dp}\left[\sum_{k=0}^\infty\left(1-p\right)^k-1\right]=-p\frac{d}{dp}\left[\frac{1}{p}-1\right]=\frac{1}{p}$$


Wikipedia and other websites have similar proofs, usually for deriving the expectation of a geometric distribution.
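As a quick sanity check (my own sketch, not part of the book's derivation), one can truncate the series numerically and compare it to $1/p$; the function name here is mine:

```python
# Numerical sanity check: the truncated series sum_{k=1}^{N} k(1-p)^(k-1) p
# should approach 1/p for any p in (0, 1].

def geometric_mean_partial_sum(p, n_terms=10_000):
    """Partial sum of k * (1-p)^(k-1) * p, the mean of a geometric(p) variable."""
    return sum(k * (1 - p) ** (k - 1) * p for k in range(1, n_terms + 1))

for p in (0.1, 0.5, 0.9):
    print(p, geometric_mean_partial_sum(p), 1 / p)
```

For each $p$, the partial sum agrees with $1/p$ to high precision, which is at least reassuring before digging into why the algebra works.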

Best answer:

The author slightly overcomplicated this by separating out the first term of the sum.

By the power rule and chain rule, $\frac{d}{dp}(1-p)^n=-n(1-p)^{n-1}$. Also, if $S=\sum_{k=1}^\infty r^k$ with $|r|<1$, then $rS=S-r$, so $S=\frac r{1-r}$. Therefore:
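The closed form $S=\frac r{1-r}$ can be checked numerically against long partial sums (an illustrative sketch; the function name is mine, not from the answer):

```python
# Check S = r / (1 - r) for S = sum_{k=1}^inf r^k, |r| < 1,
# by comparing a long partial sum against the closed form.

def geometric_series_partial_sum(r, n_terms=1_000):
    """Partial sum r + r^2 + ... + r^n_terms."""
    return sum(r ** k for k in range(1, n_terms + 1))

for r in (0.3, 0.5, 0.9):
    print(r, geometric_series_partial_sum(r), r / (1 - r))
```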

$$\begin{align} \Bbb EX &=\sum_{k=1}^\infty k(1-p)^{k-1}p \\ &= -p\sum_{k=1}^\infty(-k(1-p)^{k-1}) \\ &= -p\sum_{k=1}^\infty\frac{d}{dp}(1-p)^k \\ &= -p\frac{d}{dp}\sum_{k=1}^\infty(1-p)^k \\ &= -p\frac d{dp}\frac{1-p}{1-(1-p)} \\ &= -p\frac d{dp}\left(p^{-1}-1\right) \\ &= -p(-p^{-2}) \\ &= \frac1p. \end{align}$$
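To double-check the final differentiation step, a central finite difference of $p^{-1}-1$ reproduces $-p^{-2}$ at a few values of $p$, so $-p\cdot\frac{d}{dp}(p^{-1}-1)\approx\frac1p$ (a numerical sketch, not part of the answer; the helper name is mine):

```python
# Verify d/dp (1/p - 1) = -1/p^2 by a symmetric finite difference,
# and hence that -p * derivative gives back 1/p.

def central_difference(f, x, h=1e-6):
    """Symmetric finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda p: 1 / p - 1
for p in (0.25, 0.5, 0.8):
    print(p, -p * central_difference(f, p), 1 / p)
```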