Suppose we have a collection of independent Boolean random variables $X_i$ and $Y_i$ (for $1 \le i \le N$), and are told $p_i = P(X_i = 1)$ for all $i$. We are now given a set of values $x_i$ that was produced either by $X_i$ or by $Y_i$ (but we don't know which).
We would like to determine how likely it is that the $x_i$ were produced by the $X_i$ (that is, how well the known $p_i$ values predict the values we obtained). Is there any way to do that? Is the question even meaningful, given that we know nothing about the possible distributions of the $Y_i$?
(Some special cases have obvious results: if $p_i = 0$ and $x_i = 1$ for any $i$ we can be confident the $x_i$ were produced by $Y_i$, and if $p_i = x_i$ for all $i$ we can be at least 50% confident that the $x_i$ were produced by $X_i$.)
Let $q_i=P(Y_i=1)$. The variables $q_i$ are called nuisance parameters; to deal with them, we put a prior distribution on them and then average over it.
To be more precise, the model is the following: we have parameters $p_1,\dots,p_N$ and $q_1,\dots,q_N$. We know the $p_i$, but we don't know the $q_i$ and so must put a prior distribution on them, say with p.d.f. $p(q_1,\dots,q_N)$. The $X_i$ and $Y_i$ are then generated as Bernoulli random variables with parameters $p_1,\dots,p_N,q_1,\dots,q_N$. Finally, we have chosen the $x_i$ to all equal the $X_i$ or to all equal the $Y_i$. Say $H$ is true in the first case and false in the second. We also need a prior probability on $H$, say $P(H)=1/2$.
Then we wish to calculate $P(H|x_1,\dots,x_N)$. Applying Bayes' Theorem gives
$$P(H|x_1,\dots,x_N)=\frac{P(x_1,\dots,x_N|H)P(H)}{P(x_1,\dots,x_N|H)P(H)+P(x_1,\dots,x_N|¬H)P(¬H)}$$
Now, $P(H)=P(¬H)=1/2$, and $P(x_1,\dots,x_N|H)$ can be calculated from $p_1,\dots,p_N$. So it remains to calculate $P(x_1,\dots,x_N|¬H)$. If we knew the $q_i$, we could do this immediately; that is, we know how to calculate $P(x_1,\dots,x_N|¬H,q_1,\dots,q_N)$. We then apply the law of total probability to get:
$$P(x_1,\dots,x_N|¬H)=\int P(x_1,\dots,x_N|¬H,q_1,\dots,q_N)p(q_1,\dots,q_N)\mathrm{d}q_1\dots\mathrm{d}q_N$$
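When this integral has no convenient closed form, it can be estimated by Monte Carlo: draw the $q_i$ from the prior many times and average the resulting Bernoulli likelihoods. A minimal sketch (the function names and the i.i.d. uniform prior used in the example are just illustrative choices):

```python
import random

def likelihood(x, params):
    """Independent-Bernoulli likelihood: prod of params[i] if x[i]=1, else 1-params[i]."""
    out = 1.0
    for xi, pi in zip(x, params):
        out *= pi if xi == 1 else 1.0 - pi
    return out

def marginal_not_h(x, sample_prior, n_draws=100_000):
    """Monte Carlo estimate of P(x | not H): average the likelihood
    over draws (q_1, ..., q_N) ~ p(q_1, ..., q_N)."""
    total = 0.0
    for _ in range(n_draws):
        total += likelihood(x, sample_prior())
    return total / n_draws

random.seed(0)  # reproducibility
x = [1, 0, 1]
# One possible prior: q_i i.i.d. uniform on [0, 1].
est = marginal_not_h(x, lambda: [random.random() for _ in x])
# Under this prior the exact value is (1/2)^N = 0.125, so est should be close.
```

Any prior you can sample from works here, including ones that correlate the $q_i$; only the `sample_prior` callable changes.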
The only question that remains is what prior distribution $p(q_1,\dots,q_N)$ to choose. If I knew absolutely nothing about the $q_i$, I would say that having them be i.i.d. uniformly distributed on $[0,1]$ is a good description of my (lack of) knowledge; that is, I would pick $p(q_1,\dots,q_N)=1$. But if you have extra information about the $q_i$ — for example, if you suspect they are correlated with each other — then you would have to pick a different prior $p(q_1,\dots,q_N)$ to describe your state of knowledge.
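With this i.i.d. uniform prior the integral factorizes, and each factor is $\int_0^1 q^{x_i}(1-q)^{1-x_i}\,\mathrm{d}q = 1/2$ whether $x_i$ is $0$ or $1$, so $P(x_1,\dots,x_N|¬H) = 2^{-N}$ exactly. A sketch of the resulting posterior calculation (the function name is mine, not from the question):

```python
def posterior_h(x, p, prior_h=0.5):
    """P(H | x_1, ..., x_N) under the i.i.d. uniform prior on the q_i,
    for which P(x | not H) = (1/2)^N exactly."""
    lik_h = 1.0  # P(x | H), computable from the known p_i
    for xi, pi in zip(x, p):
        lik_h *= pi if xi == 1 else 1.0 - pi
    lik_not_h = 0.5 ** len(x)  # P(x | not H) under the uniform prior
    num = lik_h * prior_h
    return num / (num + lik_not_h * (1.0 - prior_h))

# Perfect deterministic prediction, p_i = x_i for all i:
print(posterior_h([1, 0, 1], [1.0, 0.0, 1.0]))  # 8/9 ~ 0.889
```

This also matches the special cases noted in the question: if some $p_i = 0$ while $x_i = 1$, the likelihood under $H$ vanishes and the posterior is $0$; and if $p_i = x_i$ for all $i$, the posterior $2^N/(2^N + 1)$ always exceeds $1/2$.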