Finding the UMVU estimator of the function $\frac{\lambda^3}{3!}e^{-\lambda}$ for a Poisson $(\mathrm{Poiss}(\lambda))$ sample


Let's say that we want to find the uniformly minimum-variance unbiased (UMVU) estimator of $\lambda^2$ for a sample $X=(X_1,\dots,X_n)$ from the Poisson distribution. We can consider $\overline{X}^2$ and notice that

$$\mathbb{E}(\overline{X}^2) = Var(\overline{X}) + \left(\mathbb{E}(\overline{X})\right)^2 = \frac{1}{n^2} \cdot n \cdot Var(X_1) + \lambda^2 = \lambda^2 + \frac{\lambda}{n} = \lambda^2 + \mathbb{E}\left(\frac{1}{n}\overline{X} \right)$$

which gives us that $\overline{X}^2 - \frac{1}{n}\overline{X}$ is what we are searching for: it is unbiased for $\lambda^2$, and it is a function of $\overline{X}$, which for exponential families is a sufficient, and in this case also complete, statistic, so from the Lehmann–Scheffé theorem we conclude that it is the UMVU estimator.
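As a quick sanity check (my own sketch, not part of the original question), a short Monte Carlo simulation can confirm that $\overline{X}^2 - \overline{X}/n$ is unbiased for $\lambda^2$. The values of `lam`, `n`, and `reps` below are arbitrary illustrative choices.

```python
import numpy as np

def umvue_lambda_sq(samples):
    """UMVU estimator of lambda^2: xbar^2 - xbar/n (see the derivation above)."""
    n = samples.shape[-1]
    xbar = samples.mean(axis=-1)
    return xbar**2 - xbar / n

# Arbitrary illustrative choices of lambda, sample size, and replications.
rng = np.random.default_rng(0)
lam, n, reps = 2.0, 5, 200_000

# Draw `reps` independent samples of size n and average the estimator over them.
x = rng.poisson(lam, size=(reps, n))
mc_mean = umvue_lambda_sq(x).mean()
print(mc_mean)  # close to lam**2 = 4.0, up to Monte Carlo noise
```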


Now, let's consider a function like: $$g(\lambda) = \frac{\lambda^3}{3!}e^{-\lambda}.$$

which is actually the probability $P(X_i=3)$. What can we do with this to find a UMVU estimator? I was wondering whether to start, just like above, from considering

$$T(X)= \frac{\overline{X}^3}{3!}e^{-\overline{X}},$$

and eventually adjust it with some other terms, but trying to find the expected value of this almost made me cry. I was also thinking about transforming it somehow and using Basu's theorem to make things easier, but I am not sure how.

Any advice on how to do this? Or is there an easier way to find the UMVUE here?


BEST ANSWER

Start from $$ g_n = \mathcal{I}\{X_1 = 3\}, $$ which is unbiased since $\mathbb{E}[g_n] = \mathbb{P}(X_1 = 3) = g(\lambda)$. Then, following Rao–Blackwell, compute $$ g_n^{RB} = \mathbb{E}\Big[g_n \,\Big|\, \sum_{i=1}^n X_i = t\Big]. $$ Note that $g_n^{RB}$ is an unbiased estimator and a function of the complete minimal sufficient statistic $\sum_{i=1}^n X_i$. Thus, by Lehmann–Scheffé, it is the UMVUE.

Namely, \begin{align} g_n^{RB} &= \mathbb{E}\Big[g_n\,\Big|\,\sum _{i=1}^n X_i =t\Big]\\ &= \frac{\mathbb{P}(X_1 = 3)\, \mathbb{P}\big( \sum _{i=2}^n X_i =t - 3\big)}{\mathbb{P}\big( \sum _{i=1}^n X_i =t\big)}\\ & =\frac{e^{-\lambda}\lambda^3/3! \times e^{-\lambda (n-1)}\lambda^{t-3} (n-1)^{t-3} / (t-3)!}{e^{-\lambda n}\lambda^t n^t/t!} \\ & = \left( \frac{n-1}{n} \right)^{t} \binom{t}{3}(n-1)^{-3} \\ & = \left( 1 - \frac{1}{n} \right)^{n\bar{x}_n} \binom{n\bar{x}_n}{3}(n-1)^{-3}, \end{align} where the second equality uses the independence of $X_1$ and $\sum_{i=2}^n X_i$, and the third uses $\sum_{i=2}^n X_i \sim \mathrm{Poiss}((n-1)\lambda)$. Now, as a consistency check, you can use the continuous mapping theorem and the WLLN. Note that $\bar{X}_n \xrightarrow{p} \lambda$, the same is true for $\frac{1}{n-1}\sum X_i \xrightarrow{p} \lambda$, and $(1-1/n)^n \to e^{-1}$. Hence, combining these, the estimator converges in probability to $g(\lambda)$.
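The closed form above can also be checked numerically (my own sketch, not part of the answer): simulate many Poisson samples, evaluate $g_n^{RB}$ on each, and compare the average with $g(\lambda)=\lambda^3 e^{-\lambda}/3!$. The function name and the choices of `lam`, `n`, `reps` are mine.

```python
import math
import numpy as np

def umvue_p3(t, n):
    """Rao-Blackwellized estimator ((n-1)/n)^t * C(t, 3) / (n-1)^3 of P(X_i = 3).

    Note math.comb(t, 3) is 0 for t < 3, matching the conditional expectation."""
    return ((n - 1) / n) ** t * math.comb(t, 3) / (n - 1) ** 3

# Arbitrary illustrative choices; the estimator is unbiased for every lambda and n >= 2.
rng = np.random.default_rng(1)
lam, n, reps = 2.0, 10, 100_000

# t is the complete sufficient statistic: the sum of each simulated sample.
t = rng.poisson(lam, size=(reps, n)).sum(axis=1)
mc_mean = float(np.mean([umvue_p3(int(ti), n) for ti in t]))

target = lam**3 / math.factorial(3) * math.exp(-lam)  # g(lambda) = P(X_i = 3)
print(mc_mean, target)  # the two numbers nearly coincide
```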

ANOTHER ANSWER

For a sample of size $n$, define an indicator function $I(X_1)=1$ if $X_1=3$ and $0$ otherwise. Notice that $I(X_1)$ is an unbiased estimator of the given expression, since $E(I(X_1)) = P(X_1=3) = \frac{\lambda^3 e^{-\lambda}}{3!}$. Now, as $\sum_{i=1}^n X_i$ is a complete sufficient statistic for the Poisson family, using the Lehmann–Scheffé theorem we can conclude that $E(I(X_1)\mid\sum_{i=1}^n X_i=t)$ is the UMVUE of $\frac{\lambda^3 e^{-\lambda}}{3!}$.

\begin{align} E\Big(I(X_1)\,\Big|\,\sum_{i=1}^n X_i=t\Big) &= P\Big(X_1=3\,\Big|\,\sum_{i=1}^n X_i=t\Big) \\ &= \frac{P\big(X_1=3,\ \sum_{i=2}^n X_i=t-3\big)}{P\big(\sum_{i=1}^n X_i=t\big)} \\ &= \frac{P(X_1=3)\,P\big(\sum_{i=2}^n X_i=t-3\big)}{P\big(\sum_{i=1}^n X_i=t\big)} \\ &= \frac{\frac{\lambda^3 e^{-\lambda}}{3!}\cdot\frac{((n-1)\lambda)^{t-3} e^{-(n-1)\lambda}}{(t-3)!}}{\frac{(n\lambda)^t e^{-n\lambda}}{t!}} \\ &= \frac{t(t-1)(t-2)}{3!}\cdot\frac{(n-1)^{t-3}}{n^t}. \end{align}

Now replace $t$ with $\sum_{i=1}^n X_i$ and that is the required UMVUE.
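Both answers give algebraically identical estimators, since $\left(\frac{n-1}{n}\right)^t \binom{t}{3}(n-1)^{-3} = \frac{t(t-1)(t-2)}{3!}\cdot\frac{(n-1)^{t-3}}{n^t}$. A tiny numeric check (my own, over an arbitrary grid of $t$ and $n$) confirms the two closed forms agree:

```python
import math

def form_a(t, n):
    # First answer's form: ((n-1)/n)^t * C(t, 3) * (n-1)^(-3)
    return ((n - 1) / n) ** t * math.comb(t, 3) / (n - 1) ** 3

def form_b(t, n):
    # Second answer's form: t(t-1)(t-2)/3! * (n-1)^(t-3) / n^t
    return t * (t - 1) * (t - 2) / math.factorial(3) * (n - 1) ** (t - 3) / n ** t

# Compare over a grid of (t, n) values; they agree up to floating-point error.
max_gap = max(abs(form_a(t, n) - form_b(t, n))
              for t in range(3, 30) for n in (2, 5, 10))
print(max_gap)  # essentially zero
```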