I'm trying to build up an intuition about likelihoods and have come up with a few problems. This one builds on top of a previous problem: A likelihood problem with a biased coin
Suppose an unknown number of people flip an unfair coin $k$ times and $Z_i$ is the number of people who landed $i$ heads, where $i\in\{0,1,\dots,k\}$. Now suppose only $Z_0$ is unknown (so we know $Z_1,\dots,Z_k$).
I would like to find the likelihood $l(Z_1,\dots,Z_k\mid p)$, so I used the following argument:
Let $X_j \sim \mathrm{Bin}(k,p)$, where $X_j$ is the number of heads person $j$ landed. Suppose that observations are available only for $Y_j = X_j \mid \{X_j > 0\}$ ($Y_j$ follows a zero-truncated binomial distribution). Consequently, we have the following probability mass for the observed people $j = 1, 2, \dots, n$, where $n = \sum_{i=1}^k Z_i$ is the number of people who landed at least one head:
$P(X_j=i\mid X_j>0)=\frac{{k \choose i}p^i(1-p)^{k-i}}{1-(1-p)^k}, \quad i=1,\dots,k$
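As a quick sanity check on this pmf, here is a minimal Python sketch (the helper name `trunc_binom_pmf` is mine, not from the problem): a valid zero-truncated pmf must sum to 1 over $i = 1,\dots,k$.

```python
from math import comb

def trunc_binom_pmf(i, k, p):
    """P(X = i | X > 0) for X ~ Bin(k, p), valid for i = 1, ..., k."""
    return comb(k, i) * p**i * (1 - p)**(k - i) / (1 - (1 - p)**k)

# The truncated probabilities should sum to 1 over i = 1, ..., k.
total = sum(trunc_binom_pmf(i, 10, 0.3) for i in range(1, 11))
print(total)
```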
And so (I think):
$l(Z_1,\dots,Z_k\mid p)={\sum_{j=1}^kZ_j \choose Z_1,Z_2,\dots,Z_k}\prod_{i=1}^k\Big({k \choose i}{\frac{ p^i(1-p)^{k-i}}{1-(1-p)^k}}\Big)^{Z_i}$
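One way to probe whether this likelihood behaves sensibly is a simulation: generate people flipping a coin with known $p$, drop the zero-head people, and check that maximizing the proposed likelihood over a grid recovers a $\hat p$ near the truth. The sketch below (names `log_likelihood`, `p_hat` are my own) works on the log scale and uses `lgamma` for the multinomial coefficient:

```python
import random
from math import comb, log, lgamma

def log_likelihood(p, Z, k):
    """Log of the proposed likelihood: multinomial coefficient times
    the zero-truncated binomial pmf raised to the Z_i counts.
    Z maps i -> Z_i for i = 1, ..., k."""
    n = sum(Z.values())
    ll = lgamma(n + 1) - sum(lgamma(z + 1) for z in Z.values())
    for i, z in Z.items():
        pmf = comb(k, i) * p**i * (1 - p)**(k - i) / (1 - (1 - p)**k)
        ll += z * log(pmf)
    return ll

# Simulate: people flip a p = 0.3 coin k = 5 times; zero-head people are unobserved.
random.seed(0)
k, p_true = 5, 0.3
heads = [sum(random.random() < p_true for _ in range(k)) for _ in range(5000)]
Z = {i: sum(h == i for h in heads) for i in range(1, k + 1)}

# Grid-search MLE of p; it should land near p_true despite the truncation.
grid = [x / 1000 for x in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, Z, k))
print(p_hat)
```

Since the multinomial coefficient does not depend on $p$, it only shifts the log-likelihood by a constant and does not affect $\hat p$; it matters only if you need the likelihood's absolute value.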
Is this correct?