Let $X_1,\ldots, X_n$ be a random sample from $Bin(n, p)$.
(a) What is the distribution of $S = \sum\limits_{i=1}^{n} X_i$?
(b) Show that the conditional distribution of $(X_1,\ldots, X_n)$ given $S=s$ is independent of $p$.
(a) A variable $X \sim Bin(n,p)$ has the same distribution as a sum of $n$ independent Bernoulli variables $B_k \sim Ber(p)$.
Since the $X_i$ are independent, $S$ then has the same distribution as a sum of $n \cdot n = n^2$ independent Bernoulli variables, i.e. $S \sim Bin(n^2,p)$.
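Spelling out the Bernoulli decomposition (the $B_{ik}$ notation is just for this step):

```latex
X_i \stackrel{d}{=} \sum_{k=1}^{n} B_{ik}, \quad B_{ik} \stackrel{iid}{\sim} Ber(p)
\quad\Longrightarrow\quad
S = \sum_{i=1}^{n} X_i \stackrel{d}{=} \sum_{i=1}^{n}\sum_{k=1}^{n} B_{ik} \sim Bin(n^2, p).
```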
(b) The joint probability mass function reads:
$p(x_1,...,x_n|p)=\prod_i {n\choose x_i}p^{x_i}q^{n-x_i}$
where $0 \le x_i \le n$ and $q=1-p$. This can be rewritten as:
$p(x_1,...,x_n|p)=(\prod_i {n\choose x_i}) p^{\sum_i x_i}q^{n^2-\sum_ix_i}$ [1]
So now we can invoke the Factorization theorem (https://online.stat.psu.edu/stat414/node/283/), which says that $S$ is a sufficient statistic for $p$. Sufficiency is precisely the statement that the conditional distribution of $(X_1,\ldots,X_n)$ given $S=s$ does not depend on $p$, which is what we want to prove.
EDIT: here is the conclusion without using the factorization theorem. $p(x_1,\ldots,x_n|S=s)$ is zero if $x_1+\cdots+x_n \ne s$ (and in particular does not depend on $p$), so suppose in the following that $x_1+\cdots+x_n = s$. Then the event $\{X_1=x_1,\ldots,X_n=x_n\}$ is contained in $\{S=s\}$, so:
$p(x_1,...,x_n|S=s)=\frac{p(x_1,...,x_n)}{p(S=s)}$ [2]
Now $p(S=s)={n^2\choose s}p^sq^{n^2-s}$ [3]
Inserting [1] and [3] into [2], and using that $x_1+\cdots+x_n = s$, the factors $p^s q^{n^2-s}$ cancel, leaving $p(x_1,\ldots,x_n|S=s)=\left(\prod_i {n\choose x_i}\right)\Big/{n^2\choose s}$, which does not depend on $p$. This proves the claim again.
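As a numerical sanity check (not part of the proof), a brute-force enumeration for small $n$ confirms both (a) and (b); the function names here are of course just illustrative:

```python
from math import comb
from itertools import product

def bin_pmf(n, k, p):
    # P(X = k) for X ~ Bin(n, p)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n = 2  # a sample of n variables, each ~ Bin(n, p)

def conditional(p, s):
    # joint pmf of (X_1, ..., X_n), then condition on x_1 + ... + x_n = s
    joint = {x: 1.0 for x in product(range(n + 1), repeat=n)}
    for x in joint:
        for xi in x:
            joint[x] *= bin_pmf(n, xi, p)
    norm = sum(v for x, v in joint.items() if sum(x) == s)
    # check (a): the normalizer P(S = s) matches the Bin(n^2, p) pmf
    assert abs(norm - bin_pmf(n * n, s, p)) < 1e-12
    return {x: v / norm for x, v in joint.items() if sum(x) == s}

# check (b): the conditional distribution is the same for different p
s = 2
c1, c2 = conditional(0.3, s), conditional(0.7, s)
assert all(abs(c1[x] - c2[x]) < 1e-12 for x in c1)
print("checks passed")
```

For $n=2$, $s=2$ this reproduces the closed form above: the conditional probabilities are ${2\choose 0}{2\choose 2}/{4\choose 2}=1/6$, ${2\choose 1}{2\choose 1}/{4\choose 2}=4/6$, ${2\choose 2}{2\choose 0}/{4\choose 2}=1/6$, for any $p$.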