Find the mean and variance of $Y = \tfrac1n \sum_{i=1}^n X_i$ for a Poisson sample


Let $X_1, \ldots , X_n$ be a random sample from a Poisson$(\lambda)$ distribution, where $\lambda > 0$ is unknown. Let $Y =\tfrac 1 n \sum_{i=1}^n X_i$ be the sample mean.

(a) Find the mean and variance of Y .

(b) Find the MGF of Y .

(c) Can you use the result in (b) to find the distribution of Y ?

I know that if the $X_i$'s are independent, then the mean of $Y$ is $\tfrac1n(\lambda_1+\cdots+\lambda_n)$. However, can we say that they are independent? If so, I could calculate the variance by finding $\mathsf E(X^2)$.

Help is greatly appreciated!


There are 2 answers below.

BEST ANSWER

Yes, you may assume the samples are independent and identically distributed; that is how sampling from a distribution is defined.

Then indeed linearity of expectation gives
$$\mathsf E(Y) = \mathsf E\Big(\tfrac 1n\sum_{i=1}^n X_i\Big) = \tfrac 1n\sum_{i=1}^n\mathsf E(X_i) = \tfrac1n \cdot n\lambda = \lambda.$$

The mean of the sample mean is the mean of the distribution.
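As a quick numerical sanity check of $\mathsf E(Y)=\lambda$ (not part of the original answer), here is a short simulation sketch. The parameter values and the Poisson sampler (Knuth's multiplication method) are assumptions chosen for illustration.

```python
import math
import random
import statistics

# Illustrative values, not from the post: lambda = 4, samples of size n = 10
lam, n, trials = 4.0, 10, 20000
random.seed(0)

def poisson(lam):
    """Knuth's multiplication method for one Poisson(lam) draw (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Many realizations of the sample mean Y = (1/n) * sum(X_i)
sample_means = [statistics.mean(poisson(lam) for _ in range(n)) for _ in range(trials)]
est_mean = statistics.mean(sample_means)
print(est_mean)  # should be close to lam = 4.0
```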

"Then, I could calculate the variance by finding $\mathsf E(X^2)$."

Maybe, but that is unnecessary: you should already know $\mathsf{Var}(X_i)$, since the samples come from a Poisson distribution with parameter $\lambda$, for which $\mathsf{Var}(X_i)=\lambda$.

So you may now use the bilinearity of covariance, together with the fact that the samples are identically distributed and uncorrelated:
$$\mathsf {Var}(Y)=\sum_{i=1}^n\mathsf{Var}\big(\tfrac 1n X_i\big)+\underset{j\neq i}{\sum_{i=1}^n\sum_{j=1}^n}\mathsf{Cov}\big(\tfrac1n X_i,\tfrac1n X_j\big) = \sum_{i=1}^n \tfrac1{n^2}\mathsf{Var}(X_i) + 0 = \frac{n\lambda}{n^2} = \frac{\lambda}{n}.$$

So the variance of the sample mean is not the variance of the distribution; it is smaller by a factor of $n$.
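A simulation can also check $\mathsf{Var}(Y)=\lambda/n$ (again, not part of the original answer; the sampler and the values $\lambda = 4$, $n = 10$ are assumptions for illustration).

```python
import math
import random
import statistics

# Illustrative values, not from the post: lambda = 4, samples of size n = 10
lam, n, trials = 4.0, 10, 40000
random.seed(1)

def poisson(lam):
    """Knuth's multiplication method for one Poisson(lam) draw (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Variance across many realizations of the sample mean Y
sample_means = [statistics.mean(poisson(lam) for _ in range(n)) for _ in range(trials)]
var_of_mean = statistics.pvariance(sample_means)
print(var_of_mean)  # should be close to lam / n = 0.4
```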


I think it is safe to assume that the $X_i$ are independent, as otherwise we would need to be given information about their joint distribution (which isn't provided). Given that assumption, we have $$ \mathbb E[Y] = \mathbb E\left[\frac1n\sum_{i=1}^n X_i \right] = \frac1n\sum_{i=1}^n \mathbb E[X_i] = \frac1n\sum_{i=1}^n \lambda = \frac{n\lambda} n = \lambda, $$ and, using independence for the second equality, $$ \operatorname{Var}(Y) = \operatorname{Var}\left(\frac1n\sum_{i=1}^n X_i\right) = \frac1{n^2} \sum_{i=1}^n\operatorname{Var}(X_i) = \frac1{n^2}\sum_{i=1}^n\lambda=\frac{n\lambda}{n^2}=\frac{\lambda} n. $$

We compute the moment-generating function $M_n(t)$ by \begin{align} M_n(t) &= \mathbb E[e^{tY}]\\ &= \mathbb E\left[e^{\frac tn\sum_{i=1}^n X_i}\right]\\ &= \prod_{i=1}^n \mathbb E[e^{(t/n)X_i}]\\ &= \mathbb E[e^{(t/n)X_1}]^n, \end{align} where, for any real $s$, \begin{align} \mathbb E[e^{sX_1}] &= \sum_{k=0}^\infty e^{sk}e^{-\lambda}\frac{\lambda^k}{k!}\\ &=e^{-\lambda}\sum_{k=0}^\infty \frac{(\lambda e^s)^k}{k!}\\ &=e^{-\lambda}e^{\lambda e^s}\\ &=e^{\lambda(e^s-1)}. \end{align} Taking $s = t/n$, we get $$ M_n(t) = e^{n\lambda(e^{t/n}-1)}. $$

We cannot easily read off the distribution of $Y$ from this moment-generating function directly, but note that $\sum_{i=1}^n X_i$ has moment-generating function $e^{n\lambda(e^t-1)}$, from which $\sum_{i=1}^n X_i\sim\operatorname{Pois}(n\lambda)$. So we have, for each nonnegative integer $k$, $$ \mathbb P\left(Y=\frac kn\right) = \mathbb P\left(\sum_{i=1}^n X_i = k \right) = e^{-n\lambda}\frac{(n\lambda)^k}{k!}. $$
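The closed form $M_n(t) = e^{n\lambda(e^{t/n}-1)}$ can be checked numerically at a single point $t$ by comparing it with an empirical estimate of $\mathbb E[e^{tY}]$. This sketch is not from the answer; the sampler and the parameter values are assumptions for illustration.

```python
import math
import random

# Illustrative values, not from the post: lambda = 1.5, n = 5, check at t = 0.3
lam, n, trials, t = 1.5, 5, 200000, 0.3
random.seed(2)

def poisson(lam):
    """Knuth's multiplication method for one Poisson(lam) draw (fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Empirical E[e^{tY}] versus the closed form derived above
acc = 0.0
for _ in range(trials):
    y = sum(poisson(lam) for _ in range(n)) / n
    acc += math.exp(t * y)
empirical = acc / trials
theoretical = math.exp(n * lam * (math.exp(t / n) - 1))
print(empirical, theoretical)  # the two values should nearly agree
```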