MVUE for Poisson distribution and Stirling numbers

I am trying to do the following statistics exercise:

Let $x_1,...,x_n$ be a random sample from a Poisson distribution (a probability measure on $\mathbb{Z}^{\geq 0}$ for which the integer $k$ has probability $\frac{\theta^k}{k!}e^{-\theta}$) with unknown mean $\theta$. Find a minimum variance unbiased estimator for $e^{-\theta}$, the probability of the zero class.

I found a statistic $$t(x_1,\cdots,x_n) := \sum_i x_i$$ which seems to be sufficient and complete, and I also found the unbiased estimator $$g(x_1,\cdots,x_n) := \frac{|\{i: x_i=0\}|}n,$$ so by the Lehmann–Scheffé theorem all I need to do is compute the conditional expectation $E(g \mid t)$. If $$A_m:= \{(x_1,\cdots,x_n): \sum x_i =m\},$$ I find after using the multinomial theorem that the probability of $A_m$ is $$\frac{\theta ^m \cdot n^m}{e^{\theta\cdot n}m!}.$$ Thus the conditional average of $g$ over $A_m$ is the sum

$$\frac{m!}{n^m}\sum_{\substack{\sum_{i=1}^n x_i = m \\ x_i\geq 0}}\frac{|\{i: x_i=0\}|}n\cdot \frac{1}{x_1!\,x_2!\cdots x_n!} =\\ \frac{m!}{n^{m+1}}\sum_{j=0}^n \binom{n}{j}\cdot j\cdot \sum_{\substack{\sum_{i=1}^{n-j} y_i = m \\ y_i> 0}}\frac{1}{y_1!\cdots y_{n-j}!}.$$
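As a sanity check on the formula for $P(A_m)$ (equivalently, that $\sum_i x_i$ is Poisson with mean $n\theta$), one can compare it against brute-force enumeration for small cases; the values of $\theta$, $n$, and the cutoff $M$ below are arbitrary choices, not part of the exercise:

```python
from itertools import product
from math import exp, factorial, prod

theta, n, M = 0.7, 3, 5   # arbitrary small test values (my choice)

def pois(k):
    """Poisson(theta) pmf: theta^k e^{-theta} / k!"""
    return theta**k * exp(-theta) / factorial(k)

def closed(m):
    """Claimed closed form: P(A_m) = theta^m n^m e^{-n theta} / m!"""
    return (n * theta)**m * exp(-n * theta) / factorial(m)

# Brute force: sum the joint pmf over all tuples with sum(xs) = m.
# Any tuple with sum <= M has every coordinate <= M, so range(M+1) suffices.
brute = [0.0] * (M + 1)
for xs in product(range(M + 1), repeat=n):
    if sum(xs) <= M:
        brute[sum(xs)] += prod(pois(x) for x in xs)

assert all(abs(brute[m] - closed(m)) < 1e-12 for m in range(M + 1))
```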

Thanks to N. Shales' excellent answer here, I'm able to rewrite this as $$\frac{n!}{n^{m+1}}\sum_{j=1}^n \frac{1}{(j-1)!}\cdot {m\brace n-j} = \frac{n!}{n^{m+1}}\sum_{j=0}^{n-1} \frac{1}{(n-1-j)!}\cdot {m\brace j}.$$
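Although the convolution looks opaque, it is easy to confirm numerically that it agrees with the original conditional average. The following sketch (with arbitrarily chosen small $n$ and $m$, my choice) compares the two expressions in exact rational arithmetic:

```python
from itertools import product
from math import factorial, prod
from fractions import Fraction

n, m = 4, 6   # arbitrary small test values (my choice, not from the problem)

def stirling2(m, j):
    """Stirling number of the second kind, via S(m,j) = j*S(m-1,j) + S(m-1,j-1)."""
    if m == 0 and j == 0:
        return 1
    if m == 0 or j == 0:
        return 0
    return j * stirling2(m - 1, j) + stirling2(m - 1, j - 1)

# Convolution form: (n!/n^{m+1}) * sum_{j=0}^{n-1} S(m,j) / (n-1-j)!
conv = Fraction(factorial(n), n**(m + 1)) * sum(
    Fraction(stirling2(m, j), factorial(n - 1 - j)) for j in range(n)
)

# Direct form: (m!/n^{m+1}) * sum over A_m of |{i: x_i = 0}| / prod(x_i!)
brute = Fraction(factorial(m), n**(m + 1)) * sum(
    Fraction(xs.count(0), prod(factorial(x) for x in xs))
    for xs in product(range(m + 1), repeat=n) if sum(xs) == m
)

assert conv == brute
```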

So I get a convolution of Stirling numbers of the second kind against reciprocal factorials. Now I do not know much about these Stirling numbers, and at this point I am starting to wonder whether I am going about this the right way. Can someone help me?


Best answer:

$\def\e{\mathrm{e}}$Define $$ Y = \begin{cases} 1, & X_1 = 0,\\ 0, & X_1 > 0, \end{cases} $$ so that $Y$ is an unbiased estimator of $\e^{-\theta}$, since $E(Y) = P(X_1 = 0) = \e^{-\theta}$. Because $T_n = \sum\limits_{k = 1}^n X_k$ is sufficient and complete, it suffices to compute $E(Y \mid T_n)$. Using the independence of $X_1$ and $\sum\limits_{k = 2}^n X_k$, together with the fact that a sum of independent Poisson variables is again Poisson (as already computed in the question),\begin{align*} E(Y \mid T_n = m) &= P(X_1 = 0 \mid T_n = m) = \frac{P(X_1 = 0,\ T_n = m)}{P(T_n = m)}\\ &= \frac{P(X_1 = 0)\, P\left( \sum\limits_{k = 2}^n X_k = m \right)}{P(T_n = m)}\\ &= \frac{\e^{-\theta} \cdot \dfrac{(n - 1)^m \theta^m}{m!\,\e^{(n - 1)\theta}}}{\dfrac{n^m \theta^m}{m!\,\e^{n\theta}}} = \left( 1 - \frac{1}{n} \right)^m, \end{align*} where $0^0 := 1$. By the Lehmann–Scheffé theorem, $\left( 1 - \dfrac{1}{n} \right)^{T_n}$ is therefore the UMVUE for $\e^{-\theta}$.
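As a quick numerical cross-check of the conclusion (with arbitrarily chosen $\theta$ and $n$, my choice), one can verify that $E\left[(1 - 1/n)^{T_n}\right] = \e^{-\theta}$ by summing against the Poisson$(n\theta)$ pmf:

```python
from math import exp

theta, n = 0.9, 5   # arbitrary test values (my choice)
s = 1 - 1 / n

# T_n ~ Poisson(n*theta); accumulate E[s^T_n] term by term,
# updating P(T_n = m) from P(T_n = m-1) to avoid huge factorials.
pmf = exp(-n * theta)         # P(T_n = 0)
mean = pmf                    # s^0 * P(T_n = 0)
for m in range(1, 100):
    pmf *= (n * theta) / m    # P(T_n = m)
    mean += s**m * pmf

assert abs(mean - exp(-theta)) < 1e-12
```

The tail beyond $m = 100$ is negligible for these parameter values, so truncating the series does not affect the check at this tolerance.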