Alternative approaches to obtain the expected value of the geometric distribution


Given that $X$ has a geometric distribution with $p_{X}(x) = p(1-p)^{x-1}$ for $x = 1, 2, 3, \ldots$, determine $\textbf{E}(X)$.

MY ATTEMPT

\begin{align*} \textbf{E}(X) = \sum_{x=1}^{\infty}xp(1-p)^{x-1} = p\sum_{x=1}^{\infty}x(1-p)^{x-1} \end{align*}

Define

\begin{align*} F(w) = \sum_{k=1}^{\infty} w^{k} = \frac{w}{1 - w}\quad\text{for}\quad |w| < 1 \end{align*}

Differentiating term by term gives $F^{\prime}(w) = \sum_{k=1}^{\infty} kw^{k-1}$, so \begin{align*} \textbf{E}(X) = p\sum_{x=1}^{\infty}x(1-p)^{x-1} = pF^{\prime}(1-p) \end{align*}

Since $\displaystyle F^{\prime}(w) = \frac{1}{(1-w)^{2}}$, it is now possible to obtain the desired result \begin{align*} \textbf{E}(X) = \frac{p}{(1-(1-p))^{2}} = \frac{1}{p} \end{align*}
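As a quick numeric sanity check of the series manipulation above (a sketch; the values of $p$ and the truncation level are arbitrary choices, not part of the problem), truncating $p\sum_{x\ge 1} x(1-p)^{x-1}$ should give approximately $1/p$:

```python
# Truncate the series E(X) = p * sum_{x>=1} x*(1-p)^(x-1) and
# compare the partial sum with the claimed closed form 1/p.

def expected_value_partial_sum(p, terms=10_000):
    """Partial sum p * sum_{x=1}^{terms} x * (1-p)**(x-1)."""
    return p * sum(x * (1 - p) ** (x - 1) for x in range(1, terms + 1))

for p in (0.1, 0.5, 0.9):
    print(p, expected_value_partial_sum(p), 1 / p)
```

For these values the partial sum agrees with $1/p$ to many decimal places, since the tail of the series decays geometrically.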

If my answer is correct, could someone provide another approach to this problem? I'd prefer solutions that do not involve sophisticated methods. Thanks in advance.

There are 3 answers below.

ANSWER 1 (ACCEPTED)

The geometric distribution gives the number of trials until the first success (including the successful one, in your mass function above) in a sequence of trials with probability of success $p$.

The first trial is either a success (probability $p$) or a failure (probability $1-p$). If it is a success, you are done and $X=1$. If it is a failure, the process starts afresh, so the total count is one (for the failed trial) plus an independent copy of $X$.

In other words, $$ \mathbb{E}[X]=p\cdot1+(1-p)\cdot(1+\mathbb{E}[X])=1+(1-p)\mathbb{E}[X]. $$ Subtracting $(1-p)\mathbb{E}[X]$ from both sides yields $$ \mathbb{E}[X]-(1-p)\mathbb{E}[X]=1, $$ which simplifies to $$ p\mathbb{E}[X]=1\qquad\Rightarrow\qquad\mathbb{E}[X]=\frac{1}{p}. $$
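The first-step argument above can be checked empirically with a small Monte Carlo simulation (a sketch; the value of $p$, the sample size, and the seed are arbitrary choices): counting Bernoulli trials up to and including the first success should average about $1/p$.

```python
import random

def sample_geometric(p, rng):
    """Number of Bernoulli(p) trials up to and including the first success."""
    trials = 1
    while rng.random() >= p:  # failure: try again, counting one more trial
        trials += 1
    return trials

rng = random.Random(0)  # fixed seed for reproducibility
p = 0.25
n = 200_000
mean = sum(sample_geometric(p, rng) for _ in range(n)) / n
print(mean)  # close to 1/p = 4
```

With 200,000 samples the standard error is under 0.01, so the sample mean lands very close to 4.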

ANSWER 2

Your result is correct. You can also use a version of the Fubini–Tonelli theorem (i.e., changing the order of summation): $$\begin{eqnarray} \sum_{j=1}^\infty j(1-p)^{j-1}&=&\sum_{j=1}^\infty \left(\sum_{k=0}^{j-1} 1\right)(1-p)^{j-1}\\&=&\sum_{k=0}^\infty \left(\sum_{j=k+1}^\infty(1-p)^{j-1} \right)\\ &=&\sum_{k=0}^\infty \frac{(1-p)^k}{p}\\ &=&\frac{1}{p^2}. \end{eqnarray}$$ This gives $$ \Bbb E[X]=p\sum_{j=1}^\infty j(1-p)^{j-1}=\frac{1}{p}. $$
Note: $$\binom{-2}{j}=\frac{(-2)(-3)\cdots(-1-j)}{j!}=(-1)^j(j+1).$$ The generalized binomial theorem therefore also gives $$ \sum_{j=0}^\infty (j+1)x^j=\sum_{j=0}^\infty \binom{-2}{j}(-x)^j=(1-x)^{-2}. $$

ANSWER 3

I can offer a slightly different way of doing this. Start from $$ E(X) = p \sum_{x=1}^{\infty} x(1-p)^{x-1} $$ and multiply both sides by $1-p$: $$ (1-p)E(X) =p\sum_{x=1}^{\infty} x(1-p)^{x} $$ Subtracting, the coefficients telescope: $$ E(X) - (1-p)E(X) = p\sum_{x=0}^{\infty} (1-p)^{x} $$ You can see this by writing down the first few terms in each sum: $$ E(X) = p( 1 + 2(1-p) + 3(1-p)^2 + \cdots)$$ $$ (1-p)E(X) = p( (1-p)+ 2(1-p)^2 + 3(1-p)^3 + \cdots) $$ $$ E(X) - (1-p)E(X) =p (1+ (1-p) + (1-p)^2 + (1-p)^3 + \cdots) $$ The right-hand side is a geometric series, so $$ E(X) - (1-p)E(X) = p\,\frac{1}{1-(1-p)} = 1 $$ $$ E(X) =\frac{1}{p}$$
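The telescoping identity above can also be checked on partial sums (a sketch; the choices $p = 0.3$ and $n = 2000$ are arbitrary). Note that multiplying the truncated series by $1-p$ is exact, so the telescoped difference matches the truncated geometric series up to a tail term $p\,n(1-p)^n$ that vanishes as $n$ grows:

```python
p = 0.3
n = 2_000
# Truncated series E_n ~ E(X) = p * sum_{x>=1} x*(1-p)^(x-1)
E_n = p * sum(x * (1 - p) ** (x - 1) for x in range(1, n + 1))
diff = E_n - (1 - p) * E_n                       # telescoped difference
geom = p * sum((1 - p) ** x for x in range(n))   # p*(1 + (1-p) + ...)
print(diff, geom, E_n, 1 / p)
```

For this $n$ the tail term is astronomically small, so `diff` and `geom` agree to machine precision and `E_n` matches $1/p$.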