While reading E.T. Jaynes' Probability Theory and trying to redo some of the calculations myself (in this particular case, re-deriving equation 6.15), I ran into a problem where I just can't find my mistake:
We have an urn containing N balls in total, of which R are red and the remaining N-R are white. N is known, and we draw n balls at random, of which r are red. Given the outcome of our draws, we now try to estimate R using Bayes' theorem:
$$ \underbrace{p(R | r, n, N)}_{\text{posterior}} = \underbrace{p(R | N)}_{\text{prior}} \frac{\overbrace{p(r | n, N, R)}^{\text{sampling distribution}}}{\underbrace{p(r | n, N)}_{\text{marginalization}}} $$
Using a uniform prior and the hypergeometric distribution as the sampling distribution:
\begin{align*} \text{prior} &= \frac{1}{N+1} \\ \text{sampling distribution} &= \frac{{{R}\choose{r}} {{N-R}\choose{n-r}}}{{{N}\choose{n}}} \\ \text{marginalization} &= \frac{\sum_{R=0}^N {{R}\choose{r}} {{N-R}\choose{n-r}}}{{{N}\choose{n}}} \stackrel{\text{Vandermonde's identity}}{=} \frac{{{N+1}\choose{n+1}}}{{{N}\choose{n}}} \end{align*}
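The identity itself checks out numerically; here is a quick sanity check with arbitrarily chosen example values $N=20$, $n=5$, $r=2$ (using Python's `math.comb`, which returns 0 whenever the lower index exceeds the upper):

```python
from math import comb

# Verify sum_{R=0}^{N} C(R, r) * C(N-R, n-r) = C(N+1, n+1)
# for arbitrary small example values.
N, n, r = 20, 5, 2
lhs = sum(comb(R, r) * comb(N - R, n - r) for R in range(N + 1))
rhs = comb(N + 1, n + 1)
print(lhs, rhs)  # both sides agree
```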
the ${N \choose n}$ factors cancel and I get: $$ p(R | r, n, N) = \frac{1}{N+1} \frac{{{R}\choose{r}} {{N-R}\choose{n-r}}}{{{N+1}\choose{n+1}}}$$
But this is not properly normalized. The prior $\frac{1}{N+1}$ should cancel out too somehow to arrive at the correct result of just $\frac{{{R}\choose{r}} {{N-R}\choose{n-r}}}{{{N+1}\choose{n+1}}}$.
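A quick numerical check (same arbitrary example values $N=20$, $n=5$, $r=2$, using exact rational arithmetic) confirms that this expression sums to $\frac{1}{N+1}$ rather than 1:

```python
from math import comb
from fractions import Fraction

# Sum the candidate posterior (1/(N+1)) * C(R,r)*C(N-R,n-r) / C(N+1,n+1)
# over all R; exact rationals show the normalization defect.
N, n, r = 20, 5, 2
total = sum(Fraction(comb(R, r) * comb(N - R, n - r),
                     (N + 1) * comb(N + 1, n + 1))
            for R in range(N + 1))
print(total)  # 1/21, i.e. 1/(N+1) instead of 1
```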
I just don't see how. What am I missing/what is wrong?
And just after posting the question, a friend pointed out the error to me: the marginalization must include the prior, $$p(r|n,N) = \sum_{R=0}^N p(r,R|n,N) = \sum_{R=0}^N p(r|R,n,N) \underbrace{p(R|n,N)}_{=p(R|N)} = \frac{1}{N+1} \frac{{{N+1}\choose{n+1}}}{{{N}\choose{n}}}.$$ The extra factor of $\frac{1}{N+1}$ in the denominator cancels the prior in the numerator, leaving the properly normalized posterior $$p(R | r, n, N) = \frac{{{R}\choose{r}} {{N-R}\choose{n-r}}}{{{N+1}\choose{n+1}}}.$$
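With the prior included in the marginalization, the corrected posterior does normalize; a final check with the same example values $N=20$, $n=5$, $r=2$:

```python
from math import comb
from fractions import Fraction

# The corrected posterior C(R,r)*C(N-R,n-r) / C(N+1,n+1), summed over
# all R = 0..N, should be exactly 1.
N, n, r = 20, 5, 2
posterior = [Fraction(comb(R, r) * comb(N - R, n - r), comb(N + 1, n + 1))
             for R in range(N + 1)]
print(sum(posterior))  # 1
```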