For $X_i$ iid $\mathrm{Poisson}(\lambda)$, I'm struggling to evaluate $\mathbb{E}\big[\frac{1}{n}\sum_{i=1}^nX_i^2 \,\big|\, \sum_{i=1}^nX_i\big]$. From this question, I know that $\mathbb{E}\big[X_j \,\big|\, \sum_{i=1}^nX_i\big]=\frac{1}{n}\sum_{i=1}^nX_i$. I've also shown that $\mathbb{E}[X^2]=\lambda^2+\lambda$, but I'm not sure where to go from here.
Conditional Expectation of Second Moment Poisson
737 Views · Asked by Bumbble Comm (https://math.techqa.club/user/bumbble-comm/detail) · There are 2 best solutions below
Write $t(X_1,\dots,X_n) := \sum_i X_i$. You can use the multinomial theorem to find that the probability that $t$ takes a fixed value is $\frac{\lambda^t\cdot n^t}{e^{\lambda\cdot n}\,t!}$ (equivalently, $t$ is $\mathrm{Poisson}(n\lambda)$). Dividing the joint pmf by this, the conditional pmf of $(X_1,\dots,X_n)$ given $t$ is $\frac{t!}{n^t\,x_1!\cdots x_n!}$ on the set $\{\sum_i x_i = t\}$. Therefore we have
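As a quick numerical sanity check of that pmf claim (my addition, not part of the original answer), we can convolve the $\mathrm{Poisson}(\lambda)$ pmf with itself $n$ times and compare against $\lambda^t n^t / (e^{n\lambda}\,t!)$; the values of `lam`, `n`, and the truncation `T` below are arbitrary example choices.

```python
import math

def pois_pmf(lam, k):
    # pmf of Poisson(lam) at k
    return math.exp(-lam) * lam**k / math.factorial(k)

lam, n, T = 1.3, 4, 15
pmf = [pois_pmf(lam, k) for k in range(T + 1)]

conv = pmf[:]                      # distribution of X_1
for _ in range(n - 1):             # add X_2, ..., X_n one at a time
    conv = [sum(conv[j] * pmf[t - j] for j in range(t + 1)) for t in range(T + 1)]

for t in range(8):                 # entries up to T carry no truncation error
    claimed = lam**t * n**t / (math.exp(lam * n) * math.factorial(t))
    assert abs(conv[t] - claimed) < 1e-12
```

The convolution entries at index $t \le T$ only use pmf values at indices $\le t$, so the comparison is exact up to floating-point rounding.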
$$E[X_1^2| t] = \frac{t!}{n^t}\sum_{\sum x_i = t}\frac{x_1^2}{x_1!...x_n!} = \frac{t!}{n^t}\sum_{x_1 = 0}^t\frac{x_1^2}{x_1!}\sum_{\sum_{i\geq 2}x_i = t-x_1}\frac{1}{x_2!...x_n!} = $$
$$ \frac{t!}{n^t}\sum_{x_1 = 0}^t\frac{x_1^2}{x_1!}\cdot \frac{(n-1)^{t-x_1}}{(t-x_1)!}$$
where I used the multinomial theorem in the last equality. This is the same as
$$ \frac{t!}{n^t}\sum_{x = 1}^t\frac{x^2}{x!}\cdot \frac{(n-1)^{t-x}}{(t-x)!}= \frac{t!}{n^t}\sum_{x = 1}^t\frac{x}{(x-1)!}\cdot \frac{(n-1)^{t-x}}{(t-x)!} = \frac{t!}{n^t}\sum_{x = 0}^{t-1}\frac{(x+1)}{x!}\cdot \frac{(n-1)^{(t-1)-x}}{((t-1)-x)!}$$
$$= \frac{t!}{n^t}\sum_{x = 0}^{t-1}\frac{x}{x!}\cdot \frac{(n-1)^{(t-1)-x}}{((t-1)-x)!} + \frac{t!}{n^t}\sum_{x = 0}^{t-1}\frac{1}{x!}\cdot \frac{(n-1)^{(t-1)-x}}{((t-1)-x)!}=$$
$$ \frac{t!}{n^t}\sum_{x = 0}^{t-2}\frac{1}{x!}\cdot \frac{(n-1)^{(t-2)-x}}{((t-2)-x)!} + \frac{t!}{n^t}\sum_{x = 0}^{t-1}\frac{1}{x!}\cdot \frac{(n-1)^{(t-1)-x}}{((t-1)-x)!} $$ Now if we multiply the first sum by $\frac{(t-2)!}{(t-2)!}$ and the second by $\frac{(t-1)!}{(t-1)!}$ and use the binomial theorem we get
$$ \frac{t!}{n^t\cdot (t-2)!}\sum_{x = 0}^{t-2} {(n-1)^{(t-2)-x}}{t-2\choose x}+ \frac{t!}{n^t\cdot (t-1)!}\sum_{x = 0}^{t-1} {(n-1)^{(t-1)-x}}{t-1\choose x}=$$
$$\frac{t!\cdot n^{t-2}}{n^t\cdot (t-2)!}+\frac{t!\cdot n^{t-1}}{n^t\cdot (t-1)!} = \frac{t(t-1)}{n^2}+\frac{t}{n} = \frac{t^2-t+tn}{n^2},$$
which agrees with heropup's answer.
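To double-check the computation (my addition, not the answerer's), we can evaluate the sum $\frac{t!}{n^t}\sum_{x=1}^t \frac{x^2}{x!}\cdot\frac{(n-1)^{t-x}}{(t-x)!}$ directly for several $(n, t)$ and compare it with the closed form $(t^2 - t + tn)/n^2$:

```python
from math import factorial

for n in (2, 3, 5):
    for t in range(1, 12):
        total = sum(x**2 / factorial(x) * (n - 1)**(t - x) / factorial(t - x)
                    for x in range(1, t + 1))
        lhs = factorial(t) / n**t * total
        rhs = (t**2 - t + t * n) / n**2
        assert abs(lhs - rhs) < 1e-9
```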
Consider the conditional distribution of $X_1$ given $n \bar X$, where $\bar X$ is the sample mean. We know that in the case of $n = 2$ for general rates $\lambda_1$, $\lambda_2$, this is binomial; e.g., $$\Pr[X_1 = x \mid X_1 + X_2 = s] = \frac{\Pr[X_1 = x]\Pr[X_2 = s-x]}{\sum_{k=0}^s \Pr[X_1 = k]\Pr[X_2 = s-k]} = \binom{s}{x} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^x \left(1 - \frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{s-x},$$ so by induction on $n$ we easily get $$\Pr[X_1 = x \mid n\bar X = s] = \binom{s}{x} p^x (1 - p)^{s-x}, \quad p = \frac{1}{n}.$$

It follows that the variance of $Y = X_1 \mid n\bar X = s$ is simply $sp(1-p) = s(n-1)/n^2$, hence the second moment of $Y$ is $$\operatorname{E}[X_1^2 \mid n\bar X = s] = sp(1-p) + (sp)^2 = \frac{s(s+n-1)}{n^2}.$$

By symmetry the same holds for each $X_i$, so by linearity of expectation, $$\operatorname{E}\left[\frac{1}{n}\sum_{i=1}^n X_i^2 \,\Big|\, n\bar X\right] = \frac{n\bar X (n\bar X + n - 1)}{n^2} = \bar X \left(\bar X + 1 - \frac{1}{n}\right).$$
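A Monte Carlo check of the final formula (my addition): conditionally on $n\bar X = s$, the vector $(X_1,\dots,X_n)$ is $\mathrm{Multinomial}(s, (1/n,\dots,1/n))$, so we can sample the conditional distribution directly and compare the empirical mean of $\frac{1}{n}\sum_i X_i^2$ with $\bar X(\bar X + 1 - \frac{1}{n})$. The values of `n`, `s`, and the trial count are arbitrary choices.

```python
import random

random.seed(0)
n, s, trials = 4, 9, 200_000

acc = 0.0
for _ in range(trials):
    counts = [0] * n
    for _ in range(s):                     # drop s balls into n equally likely boxes
        counts[random.randrange(n)] += 1
    acc += sum(c * c for c in counts) / n  # (1/n) * sum of X_i^2

estimate = acc / trials
xbar = s / n
exact = xbar * (xbar + 1 - 1 / n)          # = 6.75 for n = 4, s = 9
print(estimate, exact)                     # the two should agree closely
```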