Let $X = (X_1,X_2,\ldots,X_n)$ be a vector of iid Bernoulli random variables with parameter $\theta$.
We want to show that $T(X):= \sum_{i=1}^n X_i$ is a sufficient statistic for $\theta$.
This is what they did:
$$\begin{aligned} P\left(X=x \,\middle|\, \sum_{i=1}^n X_i = t\right) &= \frac{P\left(X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n,\ \sum_{i=1}^n X_i = t\right)}{P\left(\sum_{i=1}^n X_i = t\right)} \\ &= \frac{\theta^{\sum_{i=1}^n x_i}(1-\theta)^{n-\sum_{i=1}^n x_i}}{\binom{n}{t} \theta^t (1-\theta)^{n-t}} = \frac{1}{\binom{n}{t}}. \end{aligned}$$
What I don't understand is the step from the first equality to the second; it looks like they just used the fact that
$$P(X_1=x_1,\ldots,X_n=x_n) = P(X_1=x_1)\ldots P(X_n=x_n) = \theta^{\sum_{i=1}^n x_i} (1-\theta)^{n-\sum_{i=1}^n x_i} $$
... so it seems that
$$P(X_1=x_1,\ldots,X_n=x_n) = P\left(X_1=x_1,\ldots,X_n=x_n,\ \sum_{i=1}^n X_i = t\right).$$
Can someone please explain this? Shouldn't there be another factor of
$$P\left(\sum_{i=1}^n X_i = t\right)$$
by independence as well?
By $$P\left(X_1=x_1,\ldots ,X_n=x_n,\sum_iX_i=t\right) $$ they do mean what it looks like: the probability that $X_1=x_1$ and $X_2=x_2,$ etc., and that also $\sum_iX_i=t.$

They do write the correct expression in the numerator, except that it should carry the constraint $\sum_i x_i = t.$ Under that constraint we can replace the occurrences of $\sum_i x_i$ with $t,$ which is how they get to the final expression (which should also carry the constraint). To be explicit, we have $$P\left(X_1=x_1,\ldots ,X_n=x_n,\sum_iX_i=t\right) = \theta^{\sum_i x_i}(1-\theta)^{n-\sum_i x_i}1_{\sum_i x_i=t}= \theta^t(1-\theta)^{n-t}1_{\sum_i x_i=t}.$$
The reasoning behind this expression is that of course if $\sum_i X_i = t$, the probability is zero unless $\sum_i x_i = t.$ And on the other hand, if we do meet that constraint, then the condition $\sum_i X_i=t$ is redundant, so we just have the same thing as $P(X_1=x_1,\ldots,X_n=x_n).$
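If it helps, here is a small numerical sanity check (not from the original argument): it computes $P\left(X=x \mid \sum_i X_i = t\right)$ by brute-force enumeration of all $2^n$ outcomes, rather than by the closed-form formula, and verifies that the result is $1/\binom{n}{t}$ for every $\theta$ — exactly the point of sufficiency. The function names are just for illustration.

```python
from itertools import product
from math import comb, isclose

def bernoulli_joint(x, theta):
    """P(X_1=x_1, ..., X_n=x_n) for iid Bernoulli(theta), by independence."""
    t = sum(x)
    return theta ** t * (1 - theta) ** (len(x) - t)

def conditional(x, theta):
    """P(X = x | sum_i X_i = t) with t = sum(x), via brute-force enumeration."""
    n, t = len(x), sum(x)
    # P(sum_i X_i = t): add joint probabilities over all outcomes with total t
    p_t = sum(bernoulli_joint(y, theta)
              for y in product((0, 1), repeat=n) if sum(y) == t)
    # Numerator: P(X = x, sum_i X_i = t) = P(X = x), since sum(x) = t already
    return bernoulli_joint(x, theta) / p_t

n = 4
for theta in (0.2, 0.5, 0.9):
    for x in product((0, 1), repeat=n):
        assert isclose(conditional(x, theta), 1 / comb(n, sum(x)))
print("conditional distribution equals 1/C(n,t) for every theta")
```

The assertions pass for every $\theta$ tried, confirming that the conditional distribution of $X$ given $\sum_i X_i = t$ does not depend on $\theta$.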