Method of Moments - What's the logic?


We have a random vector $X=(X_1,X_2,\dots,X_n)$ that generates a sample, under the hypothesis that the components $X_i$ are i.i.d.

In statistical inference, the method of moments suggests a way to find a point estimate of a parameter $\theta$. If you need to estimate only one parameter, the procedure seems to force you to equate the first theoretical moment $E(X_i)$ to the corresponding first sample moment.
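As a concrete sketch of the standard recipe (the exponential model here is a hypothetical choice, not from the question): if $X_i \sim \mathrm{Exp}(\theta)$ then $E(X_i) = 1/\theta$, and equating this to the first sample moment gives $\hat{\theta} = 1/\bar{X}$.

```python
import numpy as np

# Assumed model (illustrative): X_i ~ Exponential(rate=theta), so E[X_i] = 1/theta.
# First-moment matching: mean(x) = 1/theta  =>  theta_hat = 1 / mean(x).
rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.exponential(scale=1 / theta_true, size=100_000)

theta_hat = 1 / x.mean()
print(theta_hat)  # should be close to theta_true = 2.0
```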

Question:

If I have only one parameter to estimate, can I equate the second theoretical moment to the second sample moment in order to find $\theta$? Does having only one parameter to estimate force me to use just the first moment in my equation, or am I free to use any moment?

Thanks in advance.

Accepted answer:

You don't even have to use moments. Any function $g(X)$ whose expectation with respect to $P_{\theta}$ uniquely determines $\theta$ will do. By the Law of Large Numbers, the average of $g(X_i)$ converges to that expectation, so you estimate $\theta$ as the value for which the theoretical expectation equals the sample average (this works provided the inverse of $\theta \mapsto \mathbb{E}_{\theta}\, g(X)$ is continuous in $\theta$). One often uses $g(X)=X^k$, in which case this gives an estimate of the $k$-th moment of the underlying distribution (from which one deduces the parameters), hence the name.
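A minimal sketch of this point, using an exponential model as an illustrative assumption (not part of the original answer): for $X_i \sim \mathrm{Exp}(\theta)$, the second moment $E(X^2) = 2/\theta^2$ determines $\theta > 0$ uniquely, and so does a non-moment choice such as $g(X) = \mathbf{1}\{X > 1\}$ with $\mathbb{E}_{\theta}\, g(X) = e^{-\theta}$.

```python
import numpy as np

# Assumed model (illustrative): X_i ~ Exponential(rate=theta), one parameter.
rng = np.random.default_rng(1)
theta_true = 2.0
x = rng.exponential(scale=1 / theta_true, size=100_000)

# Second-moment matching: E[X^2] = 2 / theta^2, invertible for theta > 0,
# so mean(x**2) = 2 / theta^2  =>  theta_hat = sqrt(2 / mean(x**2)).
theta_hat_m2 = np.sqrt(2 / np.mean(x**2))

# A non-moment choice g(X) = 1{X > 1}: E[g(X)] = P(X > 1) = exp(-theta),
# which is also invertible in theta:
theta_hat_g = -np.log(np.mean(x > 1))

print(theta_hat_m2, theta_hat_g)  # both should be close to theta_true = 2.0
```

Both estimators are consistent; they differ only in efficiency, which is one reason the low-order moments are the conventional default.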

This answer is shamelessly lifted from https://ocw.mit.edu/courses/18-443-statistics-for-applications-fall-2003/resources/lec3/ where you can find more details.