Convergence of the expected value of the geometric distribution


Let $X \in \mathbb{N}$ be a random variable with geometric distribution with parameter $0<p<1$, i.e. $$P(X=x) ~=~ (1-p)^{x-1}p.$$

A well-known result is the fact that the expected value of $X$ is $E(X)~=~1/p$.
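As a quick numerical sanity check (not a replacement for a proof), one can truncate the sum $\sum_{x\ge1} x\,(1-p)^{x-1}p$ and compare it to $1/p$; the helper name `truncated_mean` below is just for illustration:

```python
# Sanity check (not a proof): the truncated expectation
# sum_{x=1}^{N} x * (1-p)^(x-1) * p should be close to 1/p,
# since the tail of the series is geometrically small.
def truncated_mean(p, terms=10_000):
    return sum(x * (1 - p) ** (x - 1) * p for x in range(1, terms + 1))

for p in (0.2, 0.5, 0.9):
    print(p, truncated_mean(p), 1 / p)
```

For each $p$ the truncated sum agrees with $1/p$ up to the (tiny) geometric tail.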

I've seen many ways to prove this, but all of them (implicitly) assume that the following power series converges for $0<r<1$:

$$\sum_{n=1}^\infty n\, r^n.$$

Question: how do I prove that this series is convergent? (I am aware that the geometric series $\sum_{n=0}^\infty r^n$ converges for such $r$.)

There are 2 answers below.

Best answer:

The usual way to prove that $\sum nr^n$ converges is to apply the ratio test: If $0<r<1$ then $$\lim _{n\to\infty}\left|\frac{(n+1)r^{n+1}}{nr^n}\right| =r $$ and $r$ is less than one, so the series $\sum nr^n$ converges absolutely.
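One can watch the ratio in the ratio test converge numerically; the quotient of consecutive terms is $r\,(n+1)/n$, which tends to $r$ (the helper `term_ratio` is just for illustration):

```python
# Ratio of consecutive terms of sum n*r^n:
# a_{n+1}/a_n = (n+1) r^{n+1} / (n r^n) = r * (n+1)/n  ->  r as n -> infinity.
def term_ratio(r, n):
    return ((n + 1) * r ** (n + 1)) / (n * r ** n)

r = 0.7
for n in (1, 10, 100, 1000):
    print(n, term_ratio(r, n))  # approaches 0.7
```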

Another approach is to observe that the series has positive terms, so we can rearrange it freely. If you agree that $\sum_{n=a}^\infty r^n=\displaystyle\frac {r^a}{1-r}$, you can even evaluate the series directly: $$\sum_{n=1}^\infty nr^n = \sum_{n=1}^\infty\sum_{k=1}^n r^n \stackrel{(1)}=\sum_{k=1}^\infty\sum_{n=k}^\infty r^n =\sum_{k=1}^\infty\frac{r^k}{1-r} = \frac1{1-r}\sum_{k=1}^ \infty r^k= \frac1{1-r}\frac r{1-r}= \frac r{(1-r)^2} $$ In step (1) we interchange the order of summation.
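The closed form $\sum_{n=1}^\infty nr^n = r/(1-r)^2$ derived above is easy to check numerically with a partial sum (the function name `partial_sum` is illustrative only):

```python
# Check the closed form: sum_{n>=1} n * r^n = r / (1 - r)^2.
# With 0 < r < 1 the tail beyond a few thousand terms is negligible.
def partial_sum(r, terms=2000):
    return sum(n * r ** n for n in range(1, terms + 1))

r = 0.3
print(partial_sum(r), r / (1 - r) ** 2)  # both close to 0.30/0.49
```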

Another answer:

View $\sum_{n=1}^\infty n r^n$ as a power series with coefficients $a_n = n$. By the Cauchy–Hadamard formula, its radius of convergence is $$R~=~\frac{1}{\limsup_{n\to\infty} \sqrt[n]{n}}~=~1,$$ since $\sqrt[n]{n}\to 1$ as $n\to\infty$.

So the power series converges for every $|r|<1$, and thus for every $r \in (0,1)$.
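The key limit here, $\sqrt[n]{n}\to 1$, is easy to observe numerically:

```python
# The n-th root of the coefficient a_n = n tends to 1,
# so the Cauchy-Hadamard radius of convergence is R = 1/1 = 1.
for n in (10, 1000, 100_000):
    print(n, n ** (1 / n))  # approaches 1
```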