I have the following question out of a book. I even have the solution from solutions manual that I cannot really follow either, so I thought I would ask here to see if someone could dumb it down for me.
Let $Y_1, Y_2, \ldots, Y_n$ denote a random sample from a Poisson distribution with parameter $\lambda$. Show by conditioning that $\sum_{i=1}^{n} Y_i$ is sufficient for $\lambda$.
So that is the question. From the solution I can see that they are using a conditional probability, but in the chapter this question comes from you are taught to show sufficiency using the likelihood. So what does "show by conditioning" mean? What does the act of conditioning do in this instance? Why would we use conditioning instead of the likelihood? And how would you actually do this question?
We have the Poisson pmf \begin{equation} f(y_i \mid \lambda) = \frac{\lambda^{y_i}e^{-\lambda}}{y_i!} \end{equation} so the joint distribution of the sample is \begin{equation} f(y_1, \ldots, y_n \mid \lambda) = \prod_{i=1}^n \frac{\lambda^{y_i}e^{-\lambda}}{y_i!} \end{equation} which is, by collecting like terms, \begin{equation} f(y_1, \ldots, y_n \mid \lambda) = \frac{\lambda^{\sum_{i=1}^n y_i}\, e^{-n\lambda}}{y_1! \, y_2! \cdots y_n!} \end{equation} Now let $T = \sum_{i=1}^n Y_i$. Since $T$ is a function of the sample, restricting to the event $T = t$ simply means substituting $\sum_{i=1}^n y_i = t$: \begin{equation} f(y_1, \ldots, y_n, T = t \mid \lambda) = \frac{\lambda^{t} e^{-n\lambda}}{y_1! \, y_2! \cdots y_n!} \end{equation} Furthermore, we know that the sum of $n$ independent Poisson($\lambda$) random variables is Poisson($n\lambda$), i.e. \begin{equation} g(t \mid \lambda) = \frac{(n\lambda)^t}{t!}e^{-n\lambda} \end{equation} So the conditional distribution of the sample is \begin{equation} f(y_1, \ldots, y_n \mid T=t, \lambda) = \frac{f(y_1, \ldots, y_n, T=t \mid \lambda)}{g(t \mid \lambda)} = \frac{ \dfrac{\lambda^t e^{-n\lambda}}{y_1! \cdots y_n!}}{\dfrac{(n\lambda)^t}{t!}e^{-n\lambda}} = \frac{t!}{n^t \, y_1! \cdots y_n!} \end{equation} which does not depend on $\lambda$. That is exactly what conditioning does here: once you know the total $T = t$, the individual counts carry no further information about $\lambda$, so $T$ is a sufficient statistic.
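If it helps to see this numerically, here is a small sanity check (a sketch, assuming Python with only the standard library; the function names are my own): it computes the conditional pmf directly from the Poisson pmfs and compares it to the closed form $t!/(n^t\, y_1! \cdots y_n!)$ for several values of $\lambda$, confirming that $\lambda$ cancels out.

```python
import math

def poisson_pmf(y, lam):
    # Poisson pmf: lam^y * e^{-lam} / y!
    return lam**y * math.exp(-lam) / math.factorial(y)

def conditional_pmf(ys, lam):
    # P(Y1=y1, ..., Yn=yn | T=t), computed as joint / marginal of T,
    # where T = sum(ys) is Poisson(n * lam)
    n, t = len(ys), sum(ys)
    joint = math.prod(poisson_pmf(y, lam) for y in ys)
    marginal = poisson_pmf(t, n * lam)
    return joint / marginal

def closed_form(ys):
    # t! / (n^t * y1! * ... * yn!) -- no lambda anywhere
    n, t = len(ys), sum(ys)
    denom = n**t * math.prod(math.factorial(y) for y in ys)
    return math.factorial(t) / denom

ys = (2, 0, 3)
for lam in (0.5, 1.7, 4.0):
    # same answer for every lambda, matching the closed form
    assert math.isclose(conditional_pmf(ys, lam), closed_form(ys))
```

The assertion passing for every `lam` is the numerical counterpart of the algebraic cancellation above.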