I am trying to learn how to estimate a parameter with the method of moments. The Wikipedia article and similar websites are too confusing and formal for me to follow; I'm looking for a more basic, school-style "how-to".
For example: let $Z_{1},Z_{2},...,Z_{n}$ be a simple random sample of an Erlang random variable whose probability density function is given by: $$ f(z)=\begin{cases}\lambda^{2}ze^{-\lambda z} & z \ge 0\\0 & \text{otherwise}\end{cases}$$
How do I estimate $\lambda$ with the method of moments? And how can I generalize the method to any (not too hard) problem of the same type?
EDIT: I am now trying to solve the problem, and so far I have: $$E[X] = n/\lambda$$ $$\widehat{E[X]} = \int_{0}^{\infty} \lambda z^2 e^{-\lambda z} dz = \frac{2}{\lambda^2}$$
And now I'm stuck. What do I do with the second moment? Or do I equate the theoretical and empirical moments, giving $\lambda = 2/n$?
A very simple guide:
1) Determine the number of parameters you need to estimate.
2) Express the moments (e.g. the mean and the second moment) in terms of the parameters you want to estimate; these expressions can be found on Wikipedia, for instance.
3) Calculate the corresponding empirical moments from your sample.
4) You now have as many equations as parameters, and you can solve them for the parameters.
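To make the steps concrete, here is a small Python sketch of the recipe for a hypothetical two-parameter example (not the Erlang problem above): a Gamma distribution with shape $k$ and rate $\lambda$, for which $E[X]=k/\lambda$ and $\mathrm{Var}[X]=k/\lambda^{2}$. The function name and the chosen true parameters are my own illustration, not part of the original post.

```python
import random

def method_of_moments_gamma(sample):
    # Steps 2-3: compute the empirical first and second raw moments.
    n = len(sample)
    m1 = sum(sample) / n                     # first empirical moment (sample mean)
    m2 = sum(x * x for x in sample) / n      # second empirical moment
    # Step 4: solve E[X] = k/lam and Var[X] = k/lam**2 for (k, lam).
    # Var is obtained from the raw moments as m2 - m1**2.
    var = m2 - m1 ** 2
    lam_hat = m1 / var
    k_hat = m1 ** 2 / var
    return k_hat, lam_hat

random.seed(0)
# random.gammavariate(alpha, beta) has mean alpha * beta,
# so shape k = 3 and rate lam = 2 correspond to beta = 1 / 2.
sample = [random.gammavariate(3, 0.5) for _ in range(100_000)]
k_hat, lam_hat = method_of_moments_gamma(sample)
print(k_hat, lam_hat)  # both should be close to the true values (3, 2)
```

With two unknown parameters you need two moment equations, which is why both the mean and the second moment are matched here.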
I hope this helps.
Edit:
The empirical first moment is simply the sample mean. The same holds for any moment: $\frac{1}{n}\sum_{i=1}^n Z_i^k$ is the $k$-th empirical moment. The expectation of $Z$, given that it follows the Erlang distribution above, is $\frac{2}{\lambda}$. Since this Erlang distribution has only one parameter, equating the theoretical mean to the sample mean gives $\hat{\lambda}=\frac{2}{\frac{1}{n}\sum_{i=1}^n Z_i}$.
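The estimator above can be checked numerically. The sketch below (my own illustration; the true $\lambda$ is chosen arbitrarily) simulates the Erlang sample by using the fact that an Erlang variable with shape 2 and rate $\lambda$ is the sum of two independent exponential variables with rate $\lambda$:

```python
import random

def lambda_hat(sample):
    # Method-of-moments estimator: E[Z] = 2 / lambda  =>  lambda_hat = 2 / (sample mean)
    return 2 / (sum(sample) / len(sample))

random.seed(1)
true_lam = 1.5  # arbitrary true parameter for the check
# Erlang(shape=2, rate=lam) = sum of two independent Exp(lam) draws.
sample = [random.expovariate(true_lam) + random.expovariate(true_lam)
          for _ in range(100_000)]
print(lambda_hat(sample))  # should be close to 1.5
```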