Find the UMVUE of $P(X_1+X_2+X_3=2)$ where $X_1, \dots, X_n \sim$ Poisson$(\lambda)$ (independent)


Let $X_1, \dots, X_n \sim$ Poisson$(\lambda)$ be independent. Find the UMVUE $T(X_1,\dots,X_n)$ of $g(\lambda) := P(X_1+X_2+X_3=2)$.

Since $X_1+X_2+X_3 \sim$ Poisson$(3\lambda)$, we have $g(\lambda) := P(X_1+X_2+X_3=2) = e^{-3\lambda}\frac{(3\lambda)^2}{2!}$.
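As a quick sanity check (my own addition, not part of the original post), the closed form can be verified by brute-force convolution over the finitely many nonnegative triples $(i,j,k)$ with $i+j+k=2$:

```python
from math import exp, factorial

def pois_pmf(k, lam):
    """Poisson(lam) pmf at k."""
    return exp(-lam) * lam**k / factorial(k)

def p_sum_eq_2(lam):
    # P(X1 + X2 + X3 = 2) by summing over all triples (i, j, 2-i-j)
    return sum(pois_pmf(i, lam) * pois_pmf(j, lam) * pois_pmf(2 - i - j, lam)
               for i in range(3) for j in range(3 - i))

lam = 0.7
closed_form = exp(-3 * lam) * (3 * lam)**2 / 2  # e^{-3λ}(3λ)²/2!
assert abs(p_sum_eq_2(lam) - closed_form) < 1e-12
```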

$S(\underline{X}) = \sum_{i=1}^n X_i$ is complete and sufficient, and $S(\underline{X}) \sim$ Poisson$(n\lambda)$.

So by Lehmann–Scheffé it suffices to find $h$ such that $T(\underline{X}) = h(S(\underline{X}))$ is unbiased for $g(\lambda)$.

Setting up, we have $\displaystyle E[h(S)] = e^{-3\lambda}\frac{(3\lambda)^2}{2!} \implies \sum_{k=0}^\infty \frac{e^{-n\lambda}(n\lambda)^k}{k!}h(k) = e^{-3\lambda}\frac{(3\lambda)^2}{2!}$

My next step was to move the $e^{-n\lambda}$ over so that the RHS can also be written as a power series in $\lambda$:

$\displaystyle \sum_{k=0}^\infty \frac{(n\lambda)^k}{k!}h(k) = \sum_{k=0}^\infty\frac{((n-3)\lambda)^k}{k!}\frac{(3\lambda)^2}{2!} = \sum_{k=0}^\infty \frac{(n\lambda)^k}{k!} \bigg( \sum_{j=0}^k {k \choose j}(n\lambda)^{-j}(-3\lambda)^{j}\frac{(3\lambda)^2}{2!} \bigg)$

However, I don't think I have computed this correctly; from what I have above I should be able to match coefficients and read off $h$, and hence $T(\underline{X})$.

Looking for input on whether I did this correctly.


Accepted answer:

You want to solve for $h(\cdot)$ where for every $\lambda>0$,

$$E[h(S)]=\sum_{k=0}^\infty h(k)\frac{e^{-n\lambda}(n\lambda)^k}{k!}=c\,e^{-3\lambda}\lambda^2$$

for some positive constant $c$.

That is,

$$ \sum_{k=0}^\infty \frac{h(k)n^k}{k!}\lambda^k=c\,e^{(n-3)\lambda}\lambda^2 =c\sum_{j=0}^\infty \frac{(n-3)^j}{j!}\lambda^{j+2} =\sum_{k=2}^\infty \frac{c(n-3)^{k-2}}{(k-2)!}\lambda^{k} $$

Now compare coefficients of $\lambda^k$.

Second answer:

By definition, the answer is given by
\begin{align}
&\sum_{k_{1}=0}^{\infty}\frac{\lambda^{k_{1}}e^{-\lambda}}{k_{1}!}
\sum_{k_{2}=0}^{\infty}\frac{\lambda^{k_{2}}e^{-\lambda}}{k_{2}!}
\sum_{k_{3}=0}^{\infty}\frac{\lambda^{k_{3}}e^{-\lambda}}{k_{3}!}\,
\overbrace{[z^{2}]\,z^{k_{1}+k_{2}+k_{3}}}^{\delta_{k_{1}+k_{2}+k_{3},\,2}}
\\[5mm]
= {}&e^{-3\lambda}\,[z^{2}]\left[\sum_{k=0}^{\infty}\frac{(\lambda z)^{k}}{k!}\right]^{3}
\\[5mm]
= {}&e^{-3\lambda}\,\underbrace{[z^{2}]\,e^{3\lambda z}}_{(3\lambda)^{2}/2!}
= \boxed{\frac{9}{2}\,\lambda^{2}e^{-3\lambda}}
\end{align}
where $[z^{2}]$ denotes the coefficient of $z^{2}$.