Prove whether the M.o.M. estimator is consistent or not


Let $X_{1}, X_{2}, \cdots, X_{k}$ be a random sample from $\text{Binomial}(n, p)$.

Denote the $j$th raw moment of the distribution by $\mu_{j}$.

Then the first and second moments are:

\begin{align} \mu_{1} = np, \quad \mu_{2} = np(1-p) + n^{2}p^{2} \end{align}

Since $\mu_{2}-\mu_{1}^{2} = \operatorname{Var}(X_{1}) = np(1-p)$ and $\mu_{1} = np$, we can solve for the parameters:

\begin{align} p = 1 - \frac{\mu_{2}-\mu_{1}^{2}}{\mu_{1}}, \quad n = \frac{\mu_{1}}{p} = \frac{\mu_{1}^{2}}{\mu_{1}-\mu_{2}+\mu_{1}^{2}} \end{align}

Now I use the method of moments, replacing each moment with its sample counterpart, to estimate $n$ and $p$:

\begin{align} \hat{p} = \frac{\bar{X} - \frac{1}{k}\sum{(X_{i}-\bar{X})^{2}}}{\bar{X}}, \quad \hat{n} = \frac{\bar{X}^{2}}{\bar{X} - \frac{1}{k}\sum{(X_{i}-\bar{X})^{2}}} \end{align}
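As a quick sanity check, the two estimators can be computed directly from a simulated sample. This is only a sketch using NumPy; the true parameters and sample size below are arbitrary choices, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters (arbitrary choices for illustration)
n_true, p_true = 20, 0.3
k = 10_000  # number of Binomial(n, p) observations

x = rng.binomial(n_true, p_true, size=k)
xbar = x.mean()
s2 = ((x - xbar) ** 2).mean()  # biased sample variance, (1/k) * sum(...)

p_hat = (xbar - s2) / xbar   # method-of-moments estimate of p
n_hat = xbar**2 / (xbar - s2)  # method-of-moments estimate of n
print(p_hat, n_hat)
```

By construction $\hat{n}\hat{p} = \bar{X}$, mirroring the relation $np = \mu_{1}$.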

What I want to know is whether $\hat{p}$ and $\hat{n}$ are consistent estimators.

My tentative argument is as follows (I referred to Slutsky's theorem, https://en.wikipedia.org/wiki/Slutsky%27s_theorem):

$\bar{X}$ is a consistent estimator of $\mu$, and $\frac{1}{k}\sum{(X_{i}-\bar{X})^{2}}$ is a consistent estimator of $\sigma^{2}$, where $\mu = np$ and $\sigma^{2} = np(1-p)$,

so by Slutsky's theorem, $\bar{X} - \frac{1}{k}\sum{(X_{i}-\bar{X})^{2}}$ converges to $\mu-\sigma^{2}$ as $k$ goes to infinity.

Since both the numerator and the denominator converge, $\hat{p}$ converges to $\frac{\mu-\sigma^{2}}{\mu}$ and $\hat{n}$ converges to $\frac{\mu^{2}}{\mu-\sigma^{2}}$ as $k$ goes to infinity.

So I concluded that they are consistent estimators, since these limits equal $p$ and $n$ respectively.

Is this correct? I'm quite new to mathematical statistics, so my logic may be flawed, especially in the step using Slutsky's theorem.

Answer:
In other words, taking your calculations as correct and using the identity $\frac{1}{k}\sum_{i}(X_{i}-\overline{X})^{2} = \frac{1}{k}\sum_{i} X_{i}^{2} - \overline{X}^{2}$, you get

$$\hat{p}=1-\frac{\frac{1}{k}\sum_i X_i^2}{\overline{X}}+\overline{X}$$

Thus, using the Strong Law of Large Numbers (SLLN) and the continuous mapping theorem, you get

$$\hat{p}\xrightarrow{\text{a.s.}}1-\frac{np(1-p)+n^2p^2}{np}+np=p$$

and similarly for the other estimator:

$$\hat{n}=\frac{(\overline{X})^2}{\overline{X}-\frac{1}{k}\sum_i X_i^2+(\overline{X})^2}\xrightarrow{\text{a.s.}}\frac{n^2p^2}{np-np(1-p)-n^2p^2+n^2p^2}=n$$
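The almost-sure convergence above can also be observed numerically: as the number of observations $k$ grows, the estimates settle near the true values. A minimal simulation sketch (the true parameters and sample sizes here are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n_true, p_true = 10, 0.4  # hypothetical true parameters

def mom_estimates(x):
    """Method-of-moments estimates (p_hat, n_hat) from a sample x."""
    xbar = x.mean()
    s2 = ((x - xbar) ** 2).mean()  # biased sample variance
    return (xbar - s2) / xbar, xbar**2 / (xbar - s2)

# Estimates should approach (p_true, n_true) as k grows
for k in (100, 10_000, 1_000_000):
    x = rng.binomial(n_true, p_true, size=k)
    p_hat, n_hat = mom_estimates(x)
    print(k, p_hat, n_hat)
```

Note that $\hat{n}$ is typically far from an integer and can be quite unstable for small $k$, since the denominator $\bar{X} - \frac{1}{k}\sum(X_i - \bar{X})^2$ can be close to zero (or even negative) by chance; consistency is a statement about the limit $k \to \infty$.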