UMVUE for $\lambda e^{-\lambda}$ where $X_1, X_2$ are two independent observations from a Poisson$(\lambda)$ distribution and $\lambda > 0$ is an unknown parameter.


I am only familiar with finding the UMVUE for $\lambda e^{-\lambda}$ when $X_1, X_2, \ldots, X_n \sim$ Poisson$(\lambda)$.

There, if $P(X=t)= \frac{e^{-\lambda}\lambda^{t}}{t!}$, I let $T = T(X) = I(X_1=t)$. Then $T$ is unbiased for $g(\lambda) = P(X_1=t)$.

Since $\sum X_i$ is a complete sufficient statistic for $\lambda$, $E(T\mid\sum X_i)$ is the UMVUE. The UMVUE is zero if $y=\sum X_i<t$, and for $y \geq t$,

\begin{align*}E\Big(T\,\Big|\,\sum X_i=y\Big) & = P\Big(X_1=t\,\Big|\,\sum X_i=y\Big) = \frac{P(X_1=t,\sum X_i=y)}{P(\sum X_i=y)} \\ & = \frac{P(X_1=t)\,P(\sum_{i=2}^n X_i=y-t)}{P(\sum_{i=1}^n X_i=y)} \\ &= \frac{(e^{-\lambda} \lambda^t/t!)\big([(n-1)\lambda]^{y-t}e^{-(n-1)\lambda}/(y-t)!\big)}{(n\lambda)^y e^{-n\lambda}/y!} \\ &= \frac{y!}{t!\,(y-t)!}\, (n-1)^{y-t}\,n^{-y},\end{align*}

where I used that $\sum_{i=2}^n X_i \sim$ Poisson$((n-1)\lambda)$ and $\sum_{i=1}^n X_i \sim$ Poisson$(n\lambda)$.

Thus, taking $t=1$ and writing $Y=\sum_{i=1}^n X_i$, the UMVUE of $\lambda e^{-\lambda}$ is $$\left(\frac{Y}{n}\right)\left(\frac{n-1}{n}\right)^{Y-1}.$$
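As a quick numerical sanity check (a sketch I am adding, not part of the original question), the unbiasedness of this estimator can be verified by summing it against the exact Poisson$(n\lambda)$ distribution of $Y$; the expectation should come out equal to $\lambda e^{-\lambda}$:

```python
import math

def umvue_general(y, n, t=1):
    # Rao-Blackwellized estimator: C(y, t) * (n-1)^(y-t) / n^y, zero for y < t
    if y < t:
        return 0.0
    return math.comb(y, t) * (n - 1) ** (y - t) / n ** y

def expected_value(lam, n, t=1, terms=100):
    # E[estimator] where Y = sum of n Poisson(lam) draws ~ Poisson(n * lam)
    mu = n * lam
    pmf = math.exp(-mu)          # P(Y = 0)
    total = 0.0
    for y in range(terms):
        total += umvue_general(y, n, t) * pmf
        pmf *= mu / (y + 1)      # recurrence: P(Y = y+1) = P(Y = y) * mu / (y+1)
    return total

lam, n = 1.3, 5
print(expected_value(lam, n))    # should match lam * exp(-lam)
print(lam * math.exp(-lam))
```

The Poisson pmf is built up by recurrence rather than with `math.factorial` so the sum stays in floating point without overflow.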

I was wondering how I can find the same UMVUE for $\lambda e^{-\lambda}$ when $X_1, X_2$ are just two independent observations from a Poisson$(\lambda)$.

A solution would be greatly appreciated.


Best answer:

Set

$$T=\mathbb{1}_{\{1\}}(X_1)$$

$T$ is clearly unbiased for $\lambda e^{-\lambda}$, since $\mathbb{E}[T]=\mathbb{P}[X_1=1]=\lambda e^{-\lambda}$. Thus, using the Rao–Blackwell and Lehmann–Scheffé theorems together, you know that the UMVUE is

$$\mathbb{E}[T|X_1+X_2=s]$$

This conditional expectation can be easily calculated using the definition of conditional probability. For $s \geq 1$,

$$\begin{align} \mathbb{E}[T\mid X_1+X_2=s] & =\mathbb{P}[X_1=1\mid X_1+X_2=s]\\ &=\frac{\mathbb{P}[X_1=1]\,\mathbb{P}[X_2=s-1]}{\mathbb{P}[X_1+X_2=s]}\\ &= \frac{\lambda e^{-\lambda}\cdot\frac{e^{-\lambda}\lambda^{s-1}}{(s-1)!}}{\frac{e^{-2\lambda}(2\lambda)^s}{s!}}\\ &=\frac{s!}{2^s(s-1)!}=\frac{s}{2^s}, \end{align}$$

using independence of $X_1, X_2$ and $X_1+X_2 \sim$ Poisson$(2\lambda)$. For $s=0$ the conditional expectation is $0$, which $s/2^s$ also gives, so the UMVUE is $S/2^S$ with $S=X_1+X_2$. This agrees with the general formula at $n=2$: $(y/2)(1/2)^{y-1}=y/2^y$.
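The closed form $S/2^S$ can likewise be checked numerically (again a sketch of my own, assuming only the derivation above): with $S = X_1+X_2 \sim$ Poisson$(2\lambda)$, the exact expectation $\mathbb{E}[S/2^S]$ should equal $\lambda e^{-\lambda}$ for any $\lambda > 0$.

```python
import math

def umvue_mean(lam, terms=100):
    # E[S / 2^S] with S = X1 + X2 ~ Poisson(2 * lam)
    mu = 2 * lam
    pmf = math.exp(-mu)          # P(S = 0); the estimator is 0 there anyway
    total = 0.0
    for s in range(terms):
        total += (s / 2 ** s) * pmf
        pmf *= mu / (s + 1)      # Poisson pmf recurrence
    return total

for lam in (0.5, 1.0, 2.0):
    print(umvue_mean(lam), lam * math.exp(-lam))
```

Both columns agree to machine precision, confirming the estimator is unbiased.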