Conditional expectation of $\mathbf1[X_1=0]\mid X_1+\cdots+X_n$ where $X_i$'s are i.i.d Poisson RVs


Let $X_1,\dots,X_n$ be i.i.d. Poisson random variables with mean $\mu$. Let $S=X_1+X_2+ \dots + X_n$. Set $Y=\mathbf{1}[X_1=0]$. Show that $$\mathbb{E}(Y\mid S)=\left(1-\frac{1}{n}\right)^S$$

$$\mathbb{E}(Y\mid S=s) = \sum_y y\,\frac{P(Y=y,\,S=s)}{P(S=s)}$$

$S$ is of course Poisson with mean $\mu n$. I am stuck here.



Best answer:

Since $Y$ is just an indicator variable,

\begin{align} E(Y\mid S=s)&=P(X_1=0\mid S=s) \\&=\frac{P(X_1=0,S=s)}{P(S=s)} \\&=\frac{P\left(X_1=0,\sum_{i=2}^nX_i=s\right)}{P(S=s)} \\&=\frac{P(X_1=0)P\left(\sum_{i=2}^nX_i=s\right)}{P\left(\sum_{i=1}^nX_i=s\right)} \end{align}

You know that $\sum_{i=1}^n X_i\sim\mathcal P(n\mu)$ and $\sum_{i=2}^n X_i\sim\mathcal P((n-1)\mu)$ due to independence of $X_1,X_2,\dots,X_n$.

Plugging in the Poisson pmfs, the $s!$ factors cancel and so do the exponentials, giving $$E(Y\mid S=s)=\frac{e^{-\mu}\cdot\left((n-1)\mu\right)^s e^{-(n-1)\mu}}{(n\mu)^s e^{-n\mu}}=\left(\frac{n-1}{n}\right)^s=\left(1-\frac{1}{n}\right)^s$$
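The identity can be sanity-checked by simulation. A minimal sketch using only the standard library (the `poisson` sampler is a hand-rolled Knuth multiplication sampler, and the values $\mu=2$, $n=5$ are arbitrary choices, not from the problem):

```python
import math
import random

def poisson(mu, rng):
    """Sample from Poisson(mu) via Knuth's multiplication method."""
    limit = math.exp(-mu)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < limit:
            return k
        k += 1

def conditional_freq(mu, n, trials=100_000, seed=0):
    """Empirical P(X_1 = 0 | S = s) for each value of s observed."""
    rng = random.Random(seed)
    hits = {}    # s -> number of trials with X_1 = 0
    counts = {}  # s -> number of trials with that value of s
    for _ in range(trials):
        xs = [poisson(mu, rng) for _ in range(n)]
        s = sum(xs)
        counts[s] = counts.get(s, 0) + 1
        if xs[0] == 0:
            hits[s] = hits.get(s, 0) + 1
    return {s: hits.get(s, 0) / c for s, c in counts.items()}

freq = conditional_freq(mu=2.0, n=5)
# Compare the empirical conditional frequency with (1 - 1/n)^s.
for s in (8, 10, 12):
    print(s, freq[s], (1 - 1 / 5) ** s)
```

Note that the empirical frequencies do not depend on $\mu$ beyond sampling noise, which reflects the fact that $S$ is sufficient for $\mu$.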

If we are estimating the unknown parameter $\mu$, then this conditional expectation indeed gives us the UMVUE of $P(X_1=0)=e^{-\mu}$: $Y$ is unbiased for $e^{-\mu}$, $S$ is a complete sufficient statistic for $\mu$, and the Lehmann–Scheffé theorem then applies.
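The unbiasedness is immediate from the probability generating function: for $S\sim\mathcal P(n\mu)$ we have $E[t^S]=e^{n\mu(t-1)}$, and $t=1-\frac1n$ gives $E\left[(1-\frac1n)^S\right]=e^{-\mu}$. A quick numerical check of that identity, summing the Poisson series term by term (the values $\mu=2$, $n=5$ are arbitrary):

```python
import math

def expected_estimator(mu, n, terms=200):
    """Compute E[(1 - 1/n)^S] for S ~ Poisson(n*mu) by direct summation."""
    lam = n * mu
    t = 1 - 1 / n
    pmf = math.exp(-lam)  # P(S = 0)
    total = 0.0
    for s in range(terms):
        total += t ** s * pmf
        pmf *= lam / (s + 1)  # advance to P(S = s + 1)
    return total

# Should agree with e^{-mu} up to the truncated (negligible) tail.
print(expected_estimator(2.0, 5), math.exp(-2.0))
```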

Second answer:

Note that the expression $ \mathbb{E}(Y ~|~ S) $ can be significantly simplified. Since $Y$ takes only two values ($0$ and $1$), we have $$ \mathbb{E}(Y ~|~ S) = \mathbb{P}(Y = 1 ~|~ S) = \mathbb{P}(X_1 = 0 ~|~ S) \equiv \mathbb{P}(X_1 = 0 ~|~ X_1 + \dots + X_n = S) $$ For the latter we use the definition of conditional probability along with the properties of the Poisson distribution: $$ \mathbb{P}(X_1 = 0 ~|~ X_1 + \dots + X_n = S) = \frac{\mathbb{P}(X_1 = 0 \cap X_2 + \dots + X_n = S)} {\mathbb{P}( X_1 + \dots + X_n = S)} = \frac{e^{-\mu} \cdot \dfrac{(\mu(n -1))^S e^{-\mu(n-1)}}{S!}}{\dfrac{(\mu n)^S e^{-\mu n}}{S!}} = \left(\frac{n-1}{n}\right)^S, $$ as required: the $S!$ factors cancel, as do the exponentials, since $e^{-\mu}\,e^{-\mu(n-1)} = e^{-\mu n}$.
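The final fraction can also be checked numerically by evaluating the Poisson pmfs directly (the values $\mu=1.5$, $n=4$ are arbitrary; this is a sketch, not part of the proof):

```python
import math

def poisson_pmf(s, lam):
    """pmf of Poisson(lam) at s."""
    return lam ** s * math.exp(-lam) / math.factorial(s)

def conditional_prob(mu, n, s):
    """P(X_1 = 0 | S = s) computed from the fraction of Poisson pmfs."""
    joint = math.exp(-mu) * poisson_pmf(s, (n - 1) * mu)
    return joint / poisson_pmf(s, n * mu)

# Compare with the closed form ((n-1)/n)^s.
for s in range(6):
    print(s, conditional_prob(1.5, 4, s), (1 - 1 / 4) ** s)
```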