Consider $M$ observations ($x_i$, $n_i$) where $x_i$ is a realisation from $X_i \sim \mbox{Binomial}(n_i,p_i)$ and $p_i$ is a realisation from $P_i \sim \mbox{Beta}(\alpha, \beta)$.
I would like to find the maximum likelihood estimates of $\alpha, \beta$.
I begin by writing
$$\prod_{i=1}^MPr(x_i|\alpha, \beta, n_i) = \prod_{i=1}^M \int_0^1Pr(x_i|p_i,\alpha, \beta, n_i)Pr(p_i|\alpha, \beta, n_i)dp_i$$
$$=\prod_{i=1}^M \int_0^1 {n_i \choose x_i} x_i^{p_i} (n_i-x_i)^{1-p_i} \frac{p_i^{\alpha-1} (1-p_i)^{\beta-1}}{B(\alpha,\beta)}dp_i$$
$$=B(\alpha,\beta)^{-M}\prod_{i=1}^M {n_i \choose x_i}\int_0^1 x_i^{p_i} (n_i-x_i)^{1-p_i} p_i^{\alpha-1} (1-p_i)^{\beta-1}dp_i$$
Here I have used the facts that $Pr(x_i|p_i,\alpha, \beta, n_i)= Pr(x_i|p_i, n_i)$, which is $\mbox{Binomial}(n_i,p_i)$, and $Pr(p_i|\alpha, \beta, n_i)= Pr(p_i|\alpha, \beta)$, which is $\mbox{Beta}(\alpha, \beta)$.
However, I cannot see a good way to proceed from here. I would be happy with a numerical optimisation that solves this reasonably quickly. Is that possible?
The key mistake is incorrectly writing the Binomial distribution pdf as $Pr(x_i|p_i,n_i)={n_i \choose x_i} x_i^{p_i} (n_i-x_i)^{1-p_i}$, instead of $Pr(x_i|p_i,n_i)={n_i \choose x_i} p_i^{x_i} (1-p_i)^{n_i-x_i}$.
With the correct pdf, the integrand becomes $p_i^{x_i + \alpha - 1} (1-p_i)^{n_i - x_i + \beta - 1}$, which is the kernel of a Beta distribution and thus integrates to $B(x_i + \alpha, n_i - x_i +\beta)$.
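As a quick sanity check, this integral identity can be verified numerically (the values for $x_i$, $n_i$, $\alpha$, $\beta$ below are arbitrary):

```python
from scipy.integrate import quad
from scipy.special import beta

# Arbitrary example values for x_i, n_i, alpha, beta
x, n, a, b = 3, 10, 2.0, 5.0

# Integrand: the Beta kernel p^(x + a - 1) * (1 - p)^(n - x + b - 1)
integral, _ = quad(lambda p: p**(x + a - 1) * (1 - p)**(n - x + b - 1), 0, 1)

# Should match B(x + alpha, n - x + beta)
assert abs(integral - beta(x + a, n - x + b)) < 1e-10
```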
Considering now the log-likelihood $LL$, we see that
$$LL = -M\log B(\alpha,\beta) +\sum_{i=1}^M \left[ \log{n_i \choose x_i} + \log B(x_i + \alpha, n_i - x_i + \beta )\right]$$
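In code, this log-likelihood can be evaluated stably with `scipy.special.betaln` and `gammaln` (the function name and signature here are my own):

```python
import numpy as np
from scipy.special import betaln, gammaln

def beta_binom_loglik(alpha, beta, x, n):
    """Log-likelihood of (alpha, beta) given counts x out of n."""
    x, n = np.asarray(x, float), np.asarray(n, float)
    # log C(n, x) via log-gamma: log n! - log x! - log (n - x)!
    log_binom = gammaln(n + 1) - gammaln(x + 1) - gammaln(n - x + 1)
    # Sum of log C(n_i, x_i) + log B(x_i + a, n_i - x_i + b) - log B(a, b)
    return np.sum(log_binom + betaln(x + alpha, n - x + beta)
                  - betaln(alpha, beta))
```

For example, with $\alpha = \beta = 1$ the marginal distribution of each $x_i$ is uniform on $\{0, \dots, n_i\}$, so the log-likelihood reduces to $-\sum_i \log(n_i + 1)$, which gives an easy correctness check.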
So the partial derivatives are $$\frac{\partial LL}{\partial \alpha} = -M(\psi(\alpha) - \psi(\alpha + \beta)) + \sum_{i=1}^M \left[ \psi(x_i + \alpha) - \psi(n_i + \alpha + \beta) \right]$$
$$\frac{\partial LL}{\partial \beta} = -M(\psi(\beta) - \psi(\alpha + \beta)) + \sum_{i=1}^M \left[ \psi(n_i -x_i + \beta) - \psi(n_i + \alpha + \beta) \right]$$
where $\psi$ is the digamma function. We can now find the MLEs of $\alpha, \beta$ by setting the partial derivatives to $0$ and solving; there is no closed-form solution, so this last step must be done numerically.
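A minimal sketch of the numerical step using `scipy.optimize.minimize` (the function name, the Nelder-Mead choice, and the parametrisation over $\log\alpha, \log\beta$ to enforce positivity are my own; the binomial-coefficient term is dropped since it does not depend on $\alpha, \beta$):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

def fit_beta_binomial(x, n, start=(1.0, 1.0)):
    """MLE of (alpha, beta) by maximising the log-likelihood numerically.

    Optimises over log(alpha), log(beta) so both parameters stay positive.
    """
    x, n = np.asarray(x, float), np.asarray(n, float)

    def neg_loglik(log_params):
        a, b = np.exp(log_params)
        # Negative log-likelihood up to a constant in (alpha, beta)
        return -np.sum(betaln(x + a, n - x + b) - betaln(a, b))

    res = minimize(neg_loglik, np.log(start), method="Nelder-Mead")
    return np.exp(res.x)
```

On data simulated from the model (e.g. $p_i \sim \mbox{Beta}(2, 5)$, $x_i \sim \mbox{Binomial}(50, p_i)$ with a few hundred observations), the fitted $(\hat\alpha, \hat\beta)$ should land close to the true values.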