Using Rao-Blackwell to find UMVUE for $\lambda^2e^{-\lambda}$ for Poisson Distribution


I'm not sure if I'm going about this correctly. Can someone please confirm or point me in the right direction?

Let $X_1,...,X_n \sim Poisson (\lambda)$.

I want to find the UMVUE for $\lambda^2e^{-\lambda}$.

First, I find the sufficient statistic by finding the likelihood.

$L(\lambda|x)=\frac{e^{-n\lambda}\lambda^{\sum{x_i}}}{\prod{x_i!}}$. By factorization, $\sum{x_i}$ is a sufficient statistic for $\lambda$. The distribution of the sufficient statistic $T=\sum{X_i}\sim Poisson(n\lambda).$
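The fact that $T=\sum X_i \sim Poisson(n\lambda)$ can be sanity-checked numerically. A minimal simulation sketch (the NumPy usage, seed, and variable names are my own, not part of the question):

```python
import numpy as np

# Simulation check (not a proof): the sum of n i.i.d. Poisson(lam) draws
# should behave like a single Poisson(n*lam) draw.
rng = np.random.default_rng(0)
n, lam, reps = 5, 1.3, 200_000

T = rng.poisson(lam, size=(reps, n)).sum(axis=1)   # T = sum of X_i
ref = rng.poisson(n * lam, size=reps)              # direct Poisson(n*lam)

print(T.mean(), ref.mean())   # both should be near n*lam = 6.5
print(T.var(), ref.var())     # for a Poisson, variance equals mean
```

Both the mean and the variance of the simulated $T$ should match those of a direct $Poisson(n\lambda)$ sample.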

To find the unbiased estimator, I let $W(X)$ be the following:

\begin{equation} W(X) = \left \{ \begin{aligned} &1, && \text{if}\ x_1=2 \\ &0, && \text{otherwise} \end{aligned} \right. \end{equation}

Then $E(W(X)) = P(X_1=2) = \frac{\lambda^2e^{-\lambda}}{2!}$. To make the estimator unbiased, I need to multiply by $2$:

$E(2W(X)) = 2P(X_1=2) = 2 \cdot \frac{\lambda^2e^{-\lambda}}{2!} = \lambda^2e^{-\lambda},$

so $2W(X)$ is unbiased.
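Unbiasedness of the scaled indicator can be checked by simulation as well. A quick sketch (again, the setup and names here are mine):

```python
import numpy as np

# Simulation check that 2 * 1{X_1 = 2} is unbiased for lam^2 * exp(-lam).
rng = np.random.default_rng(1)
lam, reps = 1.7, 500_000

x1 = rng.poisson(lam, size=reps)
W = 2.0 * (x1 == 2)               # the scaled indicator estimator

target = lam**2 * np.exp(-lam)
print(W.mean(), target)           # sample mean should be close to the target
```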

By Rao-Blackwell, $\phi(T)=E(2W\mid T=t)=\frac{2\,P\!\left(X_1=2,\ \sum_{i=2}^{n}X_i=t-2\right)}{P\left(\sum X_i=t\right)} = \frac{2 \cdot \frac{\lambda^2e^{-\lambda}}{2!} \cdot \frac{e^{-(n-1)\lambda}\,[(n-1)\lambda]^{t-2}}{(t-2)!}}{\frac{e^{-n\lambda}(n\lambda)^{t}}{t!}} =\frac{t!\,(n-1)^{t-2}}{(t-2)!\,n^t}=\frac{\left(\sum x_i\right)!\,(n-1)^{\sum x_i-2}}{\left(\sum x_i-2\right)!\,n^{\sum x_i}}$ for $t \ge 2$ (and $\phi(t)=0$ for $t<2$).

For the last expression, I just replaced $t$ with the observed value of the sufficient statistic, $\sum x_i$. So that's the UMVUE of $\lambda^2e^{-\lambda}$. Does my process make sense? Is it correct that I multiply the indicator function by $2$ to make it unbiased?
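As a sanity check, the candidate estimator $\phi(T)=\frac{t(t-1)(n-1)^{t-2}}{n^t}$ for $t\ge2$ (zero otherwise) can be simulated and compared against the target $\lambda^2e^{-\lambda}$. A sketch under my own choice of parameters:

```python
import numpy as np

# Simulation check that phi(T) = t(t-1)(n-1)^(t-2)/n^t  (0 when t < 2)
# has expectation lam^2 * exp(-lam).
rng = np.random.default_rng(2)
n, lam, reps = 4, 0.9, 400_000

T = rng.poisson(lam, size=(reps, n)).sum(axis=1).astype(float)
phi = np.where(T >= 2, T * (T - 1) * (n - 1) ** (T - 2) / n ** T, 0.0)

target = lam**2 * np.exp(-lam)
print(phi.mean(), target)   # the two should agree closely
```

Note that $t!/(t-2)! = t(t-1)$, which is the form used in the code.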

Accepted answer:

I haven't done this in a while, so I apologize if I make any errors. You are definitely on the right track, though your notation is a bit sloppy in places.

Define

\begin{align*} W = \begin{cases} 2 & \text{if } X_1 = 2\\ 0 & \text{otherwise}. \end{cases} \end{align*}

\begin{align*} E(W) &= 2P(X_1 = 2)\\ &= 2 \cdot \frac{\lambda^2 e^{-\lambda}}{2!} = \lambda^2 e^{-\lambda}. \end{align*}

So $W$ is unbiased for $\lambda^2 e^{-\lambda}$. You already have the sufficient statistic $T$ and you know its distribution.

\begin{align*} E(W \mid T = t) &= \frac{2P(X_1 = 2 \cap T = t)}{P(T = t)}\\ &= 2\frac{P(X_1 = 2)P(\sum_{i = 2}^n X_i = t - 2)}{P(T = t)}\\ &= \lambda^2 e^{- \lambda} \cdot \frac{[(n - 1)\lambda]^{t - 2}e^{- (n - 1)\lambda}}{(t - 2)!} \cdot \frac{t!}{(n\lambda)^t e^{-n\lambda}}. \end{align*}

The rest is cleanup: the exponentials and the powers of $\lambda$ cancel, leaving $E(W \mid T = t) = \frac{t!\,(n-1)^{t-2}}{(t-2)!\,n^t}$. Since $T$ is complete and sufficient, by Lehmann-Scheffé this is the UMVUE.
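The cancellation can also be verified exactly: the conditional expectation written in terms of Poisson pmfs should equal the closed form $\frac{t!\,(n-1)^{t-2}}{(t-2)!\,n^t}$ for every $t \ge 2$. A small check with arbitrary parameter values of my choosing:

```python
from math import exp, factorial

def pois_pmf(k, mu):
    # Poisson pmf, computed directly from the definition
    return exp(-mu) * mu**k / factorial(k)

n, lam = 6, 1.4   # arbitrary; the identity should hold for any n >= 2, lam > 0

for t in range(2, 15):
    # E(W | T = t) assembled from the three Poisson probabilities
    lhs = 2 * pois_pmf(2, lam) * pois_pmf(t - 2, (n - 1) * lam) / pois_pmf(t, n * lam)
    # the simplified closed form
    rhs = factorial(t) * (n - 1) ** (t - 2) / (factorial(t - 2) * n ** t)
    assert abs(lhs - rhs) < 1e-12
```

Every $\lambda$ drops out of the left-hand side, which is exactly what makes $\phi(T)$ a statistic.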