Suppose $X_1, X_2, \ldots, X_n$ constitute a random sample drawn from a population which has a probability function given by $$\Pr[X = x] = \frac{1}{\mu} \left( 1 - \frac{1}{\mu} \right)^{x-1}, \quad x = 1, 2, \ldots,$$ where $\mu$ is a constant $\ge 1$.
Find the estimator of $\mu$ by the method of moments.
It's a tutorial question. I tried multiplying $x$ by the probability function, and also tried the MGF, but hit dead ends both times. Any help appreciated, thanks.
We will prove the following lemma: for $|z| < 1$, we have $$\sum_{k=1}^\infty k z^{k-1} = (1-z)^{-2}.$$ One simple proof is to recall that $$\sum_{k=0}^\infty z^k = \frac{1}{1-z},$$ and by differentiation of formal power series, we then have $$\frac{d}{dz}\left[\sum_{k=0}^\infty z^k \right] = \sum_{k=0}^\infty \frac{d}{dz}\left[z^k\right] = \sum_{k=1}^\infty kz^{k-1} = \frac{d}{dz}\left[\frac{1}{1-z}\right] = (1-z)^{-2}.$$
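As a quick sanity check on the lemma (not part of the proof), a partial sum of the series can be compared numerically against the closed form $(1-z)^{-2}$; the helper name below is my own:

```python
def partial_sum(z, terms=10_000):
    """Partial sum of sum_{k=1}^{terms} k * z^(k-1), an approximation
    of the full series, which converges for |z| < 1."""
    return sum(k * z ** (k - 1) for k in range(1, terms + 1))

# Compare the truncated series to the closed form (1 - z)^(-2).
for z in (0.1, 0.5, 0.9):
    print(f"z={z}: series ~ {partial_sum(z):.6f}, (1-z)^-2 = {(1 - z) ** -2:.6f}")
```

For any $|z| < 1$ the truncation error vanishes geometrically, so even a modest number of terms matches the closed form to many decimal places.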
Now, the method of moments involves equating the sample raw moments with the corresponding raw moments of the distribution; i.e., for $k = 1, 2, \ldots$, solve the system $${\rm E}[X^k] = \frac{1}{n} \sum_{i=1}^n X_i^k$$ where we use as many $k$ as needed to uniquely determine the parameters. Since there is only one parameter $\mu$, we need only the first moment: $$\begin{align*} {\rm E}[X] &= \sum_{x=1}^\infty x \Pr[X = x] = \sum_{x=1}^\infty \frac{x}{\mu}\left(1 - \frac{1}{\mu}\right)^{\!x-1} \\ &= \frac{1}{\mu}\left(1 - \left(1 - \frac{1}{\mu}\right)\right)^{-2} \\ &= \frac{1}{\mu} \cdot \mu^2 = \mu, \end{align*}$$ by the earlier lemma applied with $z = 1 - 1/\mu$. Therefore, we simply have $$\tilde \mu = \bar X,$$ where $\tilde \mu$ is our method of moments estimator of the parameter $\mu$, and $\bar X = \frac{1}{n} \sum_{i=1}^n X_i$ is the sample mean.
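The estimator can be checked by simulation: the given probability function is that of a geometric distribution on $\{1, 2, \ldots\}$ with success probability $p = 1/\mu$, so we can draw a large sample and confirm that the sample mean lands near $\mu$. This is a minimal sketch (the function name and parameter values are my own choices, not from the question):

```python
import random

def sample_geometric(mu, n, rng):
    """Draw n values from Pr[X = x] = (1/mu)(1 - 1/mu)^(x-1), x = 1, 2, ...,
    i.e. a geometric distribution with success probability p = 1/mu,
    by counting Bernoulli(p) trials until the first success."""
    p = 1.0 / mu
    out = []
    for _ in range(n):
        x = 1
        while rng.random() >= p:
            x += 1
        out.append(x)
    return out

rng = random.Random(42)       # fixed seed for reproducibility
mu = 4.0                      # an arbitrary true parameter value
data = sample_geometric(mu, n=100_000, rng=rng)

# Method-of-moments estimator: just the sample mean.
mu_tilde = sum(data) / len(data)
print(f"true mu = {mu}, estimate = {mu_tilde:.3f}")
```

With $n = 100{,}000$ draws the standard error of $\bar X$ is about $\sqrt{\mu(\mu-1)/n} \approx 0.011$ here, so the estimate should sit very close to 4.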