Consistency of the estimators of a zero-inflated Poisson


I'm doing this exercise from my Inferential Statistics notes:

The number of cell defects observed in a person exposed to radiation follows a mixture of a Poisson distribution and a point mass at zero (in a healthy person, zero defects are observed). Let $X \sim$ Poisson($\lambda$), and suppose that $Y$ is the variable that equals $X$ with probability $\epsilon$ and the constant $0$ with probability $1 - \epsilon$.

· Find the probability function of $Y$.

· Find estimators of $\epsilon$ and $\lambda$ by the method of moments.

· Are they consistent estimators?

I've solved the first and second points, but I can't prove the third.

The estimators $\epsilon_{e}$ and $\lambda_{e}$ are:

$$\epsilon_{e}=\frac{M_1^{2}}{M_2-M_1},$$

and

$$\lambda_{e}=\frac{M_2\cdot M_1-M_1^{2}}{M_1^{2}},$$

where $M_1$ and $M_2$ are the first and the second moment of $Y$.
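For reference, the theoretical moments I used follow from conditioning on whether $Y$ comes from the Poisson component or the point mass at zero:

$$
M_1 = E[Y] = \epsilon\, E[X] = \epsilon\lambda, \qquad
M_2 = E[Y^2] = \epsilon\, E[X^2] = \epsilon(\lambda + \lambda^2),
$$

using $E[X^2] = \operatorname{Var}(X) + (E[X])^2 = \lambda + \lambda^2$ for $X \sim$ Poisson($\lambda$).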

How can we prove, from these expressions, that these estimators are consistent?

Thanks for your time.

Accepted answer:

To construct estimators, you need a sample from the distribution of $Y$. Say $y_1,\ldots, y_n$ are independent random variables with the same distribution as $Y$. Your estimators of the unknown parameters should depend on this sample. The method of moments chooses the parameter values that equate the theoretical moments with the empirical ones. So these estimators are the solutions (w.r.t. $\lambda_e$ and $\epsilon_e$) of the following system of equations: $$ \color{red}{\epsilon_e\lambda_e= \bar y} = \dfrac1n\sum_{i=1}^n y_i, \quad \color{red}{\epsilon_e(\lambda_e+\lambda_e^2)=\overline {y^2}} = \dfrac1n\sum_{i=1}^n y_i^2. $$ Here $M_1=\epsilon\lambda$ and $M_2=\epsilon(\lambda+\lambda^2)$, and we equate them to the sample moments.

So, the estimators are $$ \epsilon_{e}=\frac{\left(\overline y\right)^{2}}{\overline {y^2}-\overline y}, \quad \lambda_{e}=\frac{\overline {y^2}\cdot \overline y-\left(\overline y\right)^{2}}{\left(\overline y\right)^{2}}=\frac{\overline {y^2}}{\overline y}-1. $$ By the Law of Large Numbers, the sample moments converge in probability to the theoretical moments: $$\overline y \xrightarrow{p} M_1 \text{ as } n\to\infty$$ and $$\overline{y^2} \xrightarrow{p} M_2 \text{ as } n\to\infty.$$ Moreover, sums, differences, products, and quotients of sequences converging in probability converge to the corresponding combinations of the limits (for quotients, provided the limiting denominator is nonzero, which holds here since $M_1 = \epsilon\lambda > 0$ and $M_2 - M_1 = \epsilon\lambda^2 > 0$). Then $$ \epsilon_{e}=\frac{\left(\overline y\right)^{2}}{\overline {y^2}-\overline y} \xrightarrow{p} \frac{M_1^{2}}{M_2-M_1} = \epsilon $$ $$\lambda_{e}=\frac{\overline {y^2}\cdot \overline y-\left(\overline y\right)^{2}}{\left(\overline y\right)^{2}}=\frac{\overline {y^2}}{\overline y}-1 \xrightarrow{p} \frac{M_2}{M_1}-1=\lambda. $$ Both estimators are consistent.
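You can also watch the consistency in action by simulation: for a large sample, the moment estimators land close to the true parameters. A minimal sketch in Python with NumPy (the parameter values $\epsilon = 0.3$, $\lambda = 4$ are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_zip(n, eps, lam, rng):
    """Draw n observations of Y: Poisson(lam) with probability eps, 0 otherwise."""
    exposed = rng.random(n) < eps
    return np.where(exposed, rng.poisson(lam, size=n), 0)

def mom_estimates(y):
    """Method-of-moments estimates from the sample moments m1 and m2."""
    m1 = y.mean()          # estimates M_1 = eps * lam
    m2 = (y ** 2).mean()   # estimates M_2 = eps * (lam + lam^2)
    eps_e = m1 ** 2 / (m2 - m1)
    lam_e = m2 / m1 - 1
    return eps_e, lam_e

y = sample_zip(100_000, eps=0.3, lam=4.0, rng=rng)
eps_e, lam_e = mom_estimates(y)
print(eps_e, lam_e)  # typically close to the true values 0.3 and 4.0
```

Repeating this for growing $n$ shows the estimates settling onto $(\epsilon, \lambda)$, which is exactly the convergence-in-probability statement above.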