Show an estimator for $f(\theta)=e^{-\theta}$ is consistent


Let $X_1,\dots,X_n$ be i.i.d. $\mathrm{Pois}(\theta)$ with unknown $\theta>0$ and $n\geq 2$. Is $\hat{\gamma}=\left(\frac{n-1}{n}\right)^T$, where $T=\sum_{i=1}^{n}X_i$, the UMVUE for $f(\theta)=e^{-\theta}$? How can I show it is consistent?

To show the estimator is consistent, I want to show that $\mathbf{E}_\theta[(\hat{\gamma}-f(\theta))^2]\to 0$ as $n\to\infty$. However, I am having a hard time evaluating this expectation.

As you suggest, you can show the mean squared error goes to zero, which is sufficient for consistency: since $T\sim\mathrm{Pois}(n\theta)$, the Poisson pgf $\mathbf{E}_\theta[s^T]=e^{n\theta(s-1)}$ gives $\mathbf{E}_\theta[\hat\gamma]=e^{-\theta}$ (so $\hat\gamma$ is unbiased) and $\mathbf{E}_\theta[\hat\gamma^2]=e^{-2\theta+\theta/n}$, hence the MSE is $e^{-2\theta}\left(e^{\theta/n}-1\right)\to 0$. Alternatively, you can use the weak law of large numbers (WLLN) and the continuous mapping theorem (CMT):

Define $a_n=\left(1-\frac{1}{n}\right)^{n}$, $a=1/e$, $b_n=\frac{1}{n}\sum_{i=1}^n X_i$, and $b=\theta$.

Note that $$\hat\gamma={a_n}^{b_n}.$$

The WLLN tells us $b_n$ converges in probability to $b=\theta$; meanwhile, the deterministic sequence $a_n$ converges to $a=1/e$. Since $(x,y)\mapsto x^y$ is continuous at $(1/e,\theta)$, the CMT tells us that $\hat\gamma$ converges in probability to $a^b=e^{-\theta}$ and hence is consistent for $f(\theta)$.
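A quick simulation sketch of this convergence (assuming NumPy; the value $\theta=2$ and the tolerance $0.01$ are arbitrary illustration choices, and $T$ is drawn directly as $\mathrm{Pois}(n\theta)$ rather than as a sum of $n$ draws):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0                      # arbitrary true parameter for the illustration
target = np.exp(-theta)          # f(theta) = e^{-theta}

for n in [10, 100, 1000, 10000]:
    reps = 2000
    # T = X_1 + ... + X_n ~ Pois(n * theta), simulated directly
    T = rng.poisson(n * theta, size=reps)
    gamma_hat = ((n - 1) / n) ** T
    # fraction of replications within 0.01 of e^{-theta}
    frac_close = np.mean(np.abs(gamma_hat - target) < 0.01)
    print(n, frac_close)
```

As $n$ grows, the printed fraction approaches 1, which is exactly convergence in probability.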


Let $T=T_n=X_1+\dots+X_n$; then by the weak law of large numbers, $T_n/n$ converges in probability to $\mathbb{E}(X_i)=\theta$, i.e., $$ \mathbb{P}(A_{n,\epsilon}^c)\to 0\quad\text{as }n\to\infty, $$ where $$ A_{n,\epsilon}=\left\{\left|\frac{T_n}{n}-\theta\right|\le \epsilon\right\}. $$ On the other hand, on the event $A_{n,\epsilon}$ we have $$ \hat\gamma=\left(1-\frac1n\right)^{T_n}\in \left[\left(1-\frac1n\right)^{n(\theta+\epsilon)},\ \left(1-\frac1n\right)^{n(\theta-\epsilon)}\right], $$ and this interval converges to $$ \left[e^{-\theta-\epsilon},\ e^{-\theta+\epsilon}\right] \subset\left[e^{-\theta}(1-\epsilon),\ e^{-\theta}(1+1.5\epsilon)\right] \quad\text{as }n\to\infty, $$ since $e^{-\epsilon}>1-\epsilon$ and $e^{\epsilon}<1+1.5\epsilon$ for $\epsilon$ small enough. Hence, for all sufficiently large $n$, on $A_{n,\epsilon}$ we have $$ \hat\gamma\in\left[e^{-\theta}-2\epsilon,\ e^{-\theta}+2\epsilon\right], $$ and thus $$ \mathbb{P}(|\hat\gamma-e^{-\theta}|>2\epsilon)\le\mathbb{P}(A_{n,\epsilon}^c)\to 0. $$ Since $\epsilon>0$ is arbitrary, we're done.
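The tail probability $\mathbb{P}(|\hat\gamma-e^{-\theta}|>2\epsilon)$ in this argument can be checked empirically. A minimal sketch, assuming NumPy, with $\theta=1$ and $\epsilon=0.05$ chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
theta, eps = 1.0, 0.05           # arbitrary illustration values
target = np.exp(-theta)

tail_probs = []
for n in [10, 100, 1000]:
    # T_n = X_1 + ... + X_n ~ Pois(n * theta), simulated directly
    T = rng.poisson(n * theta, size=5000)
    gamma_hat = ((n - 1) / n) ** T
    # empirical P(|gamma_hat - e^{-theta}| > 2*eps)
    tail_probs.append(np.mean(np.abs(gamma_hat - target) > 2 * eps))

print(tail_probs)
```

The empirical tail probabilities shrink toward zero as $n$ increases, matching the conclusion $\mathbb{P}(|\hat\gamma-e^{-\theta}|>2\epsilon)\to 0$.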