I need to find an approximation, from which I can easily sample, to the following compounded Binomial distribution: $X \sim \mathrm{Binomial}(e^{-\epsilon}, \ n)$ where $\epsilon \sim \mathrm{Gamma}(\alpha, \beta)$
I know that if $e^{-\epsilon}$ were a Beta then $X$ would be a Beta-Binomial.
Moreover, I know that $N\ \mathrm{Beta(\alpha, N)} \rightarrow \mathrm{Gamma}(\alpha, 1)$ in distribution.
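This limit is easy to check numerically; a quick sketch (the choice $\alpha = 10$ and the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 10.0

# As N grows, N * Beta(alpha, N) approaches Gamma(alpha, 1),
# whose mean and variance are both alpha.
for N in (10, 100, 10_000):
    y = N * rng.beta(alpha, N, size=200_000)
    print(N, y.mean(), y.var())
```

For $N = 10{,}000$ the first two sample moments are already very close to the Gamma limits.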
However, this limit does not give me a Beta distribution for $e^{-\epsilon}$, so I cannot use this nice approximation directly. $\epsilon$ has parameters such that its mode is around $0.1$, so I thought I could use a Taylor expansion of $e^{-\epsilon}$, but it does not help make my compound Binomial tractable.
Since I want to sample from this within a particle filter, I want to avoid importance sampling (and related methods) because I need something quite fast. So I'm looking for a known distribution that would be a good approximation.
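For reference, exact sampling from the compound distribution itself is just a two-step draw (the concern is only the cost of doing this per particle); a minimal sketch with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_compound(n, alpha, beta, size):
    """Exact draws of X ~ Binomial(e^{-eps}, n) with eps ~ Gamma(alpha, rate=beta)."""
    eps = rng.gamma(shape=alpha, scale=1.0 / beta, size=size)
    return rng.binomial(n, np.exp(-eps))

x = sample_compound(n=50, alpha=10.0, beta=10.0, size=100_000)
print(x.mean())  # should be near n * E[e^{-eps}] = n * (beta / (beta + 1))**alpha
```

Any proposed approximation can be checked against draws from this baseline.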
EDIT 1: I think I've found an answer to my problem. Notice that $e^{-\epsilon} \in (0, 1]$, which is the support of a Beta. A natural idea is therefore to minimize the KL divergence between the distribution of $e^{-\epsilon}$ (its density follows from an easy change of variables) and a Beta. If we denote $P=e^{-\epsilon}$, write $q_P(p)=\frac{\beta^\alpha}{\Gamma(\alpha)}(-\log p)^{\alpha-1}p^{\beta-1}$ for its density, and denote by $q(\ \cdot \ ; a, b)$ the density of a $\mathrm{Beta}(a,b)$, we have:
\begin{equation*} \begin{split} D_{KL}(a,b) & = \int_{0}^{1}q_P(p)\log\frac{q_P(p)}{q(p; a, b)}\mathrm{d}p \\ & \propto - \int_{0}^{1}q_P(p)\log(q(p; a, b))\mathrm{d}p \\ & \propto - \int_{0}^{1}q_P(p)\log\left(\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}p^{a-1}(1-p)^{b-1}\right)\mathrm{d}p \\ & \propto -\log\Gamma(a+b) + \log\Gamma(a) + \log\Gamma(b) \\ & \qquad + (a-1)\int_{0}^{1}(-\log p)\frac{\beta^\alpha}{\Gamma(\alpha)}(-\log p)^{\alpha-1}p^{\beta-1}\mathrm{d}p \\ & \qquad - (b-1)\int_{0}^{1}\log (1-p)\frac{\beta^\alpha}{\Gamma(\alpha)}(-\log p)^{\alpha-1}p^{\beta-1}\mathrm{d}p \end{split} \end{equation*}
Now notice that $-\log p(-\log p)^{\alpha-1}p^{\beta-1}= (-\log p)^{\alpha}p^{\beta-1}$ is proportional to the density of $e^{-X}$ where $X \sim \mathrm{Gamma}(\alpha+1, \beta)$ and if we denote $K=\int_{0}^{1}\log(1-p)\frac{\beta^\alpha}{\Gamma(\alpha)}(-\log p)^{\alpha-1}p^{\beta-1}\mathrm{d}p$ we have:
\begin{equation*} \begin{split} D_{KL}(a,b) & \propto -\log\Gamma(a+b) + \log\Gamma(a) + \log\Gamma(b) \\ & \qquad + (a-1)\frac{\beta^\alpha}{\Gamma(\alpha)}\frac{\Gamma(\alpha+1)}{\beta^{\alpha+1}} - (b-1)K \\ & \propto -\log\Gamma(a+b) + \log\Gamma(a) + \log\Gamma(b) + (a-1)\frac{\alpha}{\beta} - (b-1)K \end{split} \end{equation*}
Now we minimize by finding a critical point: \begin{equation} \frac{\partial }{\partial a}D_{KL} = -\psi^{(0)}(a+b) + \psi^{(0)}(a) + \frac{\alpha}{\beta} = 0 \end{equation} \begin{equation} \frac{\partial }{\partial b}D_{KL} = -\psi^{(0)}(a+b) + \psi^{(0)}(b) - K = 0 \end{equation} where $\psi^{(0)}$ is the digamma function. $K$ is easily computed numerically, and the resulting system of two equations can then be solved numerically as well.
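A sketch of this fit in Python with SciPy, computing $K$ by quadrature and solving the two digamma equations with a root finder (the values $\alpha=\beta=10$ and the starting point are illustrative):

```python
import numpy as np
from scipy import integrate, optimize, special

alpha, beta = 10.0, 10.0  # example values; plug in your own Gamma parameters

# K = E[log(1 - e^{-eps})], eps ~ Gamma(alpha, rate beta),
# written as an integral over eps (substitute p = e^{-eps}).
def k_integrand(v):
    return (np.log1p(-np.exp(-v)) * beta**alpha * v**(alpha - 1)
            * np.exp(-beta * v) / special.gamma(alpha))

K, _ = integrate.quad(k_integrand, 0.0, np.inf)

# Critical-point system from the two digamma equations above.
def system(ab):
    a, b = ab
    return [special.digamma(a) - special.digamma(a + b) + alpha / beta,
            special.digamma(b) - special.digamma(a + b) - K]

a, b = optimize.fsolve(system, x0=[5.0, 5.0])
print(a, b)
```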
I did a numerical example with $\alpha = \beta = 10$ and found $a = 6.739281$ and $b=10.736471$. I sampled 10000 times from $e^{-\epsilon}$ and from the Beta with parameters $a$ and $b$, and drew the following QQ-plot, which seems to indicate that the approximation is accurate and that I can sample from a Beta-Binomial instead of from $\mathrm{Binomial}(e^{-\epsilon}, n)$:
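A quick numerical stand-in for the QQ-plot check, using the fitted values reported above (the seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = beta = 10.0
a, b = 6.739281, 10.736471  # fitted Beta parameters from the KL minimization

p_exact = np.exp(-rng.gamma(alpha, 1.0 / beta, size=10_000))  # samples of e^{-eps}
p_beta = rng.beta(a, b, size=10_000)                          # Beta approximation

# Compare empirical deciles: the same information a QQ-plot shows.
q = np.linspace(0.1, 0.9, 9)
decile_gap = np.quantile(p_exact, q) - np.quantile(p_beta, q)
print(decile_gap)
```

The decile gaps come out small relative to the spread of the distributions, consistent with the QQ-plot.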

The density of $\pi = e^{-\epsilon}$, where $\epsilon \sim \operatorname{Gamma}(a,b)$ is parametrized by rate $b$, is given by $$f_\pi(p) = \frac{b^a (-\log p)^{a-1} p^{b-1}}{\Gamma(a)}, \quad 0 < p < 1.$$ Therefore $$\begin{align*} \Pr[X = x] &= \int_{p=0}^1 \Pr[X = x \mid \pi = p] f_\pi(p) \, dp \\ &= \binom{n}{x} \frac{b^a}{\Gamma(a)} \int_{p=0}^1 p^{x+b-1} (1-p)^{n-x} (-\log p)^{a-1} \, dp \\ &= \binom{n}{x} \frac{b^a}{\Gamma(a)} \int_{v=0}^\infty e^{-(x+b-1)v} (1-e^{-v})^{n-x} v^{a-1} e^{-v} \, dv, \end{align*}$$ where the last step substitutes $p = e^{-v}$. At this point we can see that since $n, x$ are nonnegative integers (which of course they are), we can expand the integrand, integrate term by term, and obtain a finite linear combination of gamma functions in terms of $a$ and $b$. The resulting formula is lengthy but closed form and programmable.
$$\begin{align*} \Pr[X = x] &= \binom{n}{x} \frac{b^a}{\Gamma(a)} \sum_{k=0}^{n-x} \binom{n-x}{k} (-1)^k \int_{v=0}^\infty v^{a-1} e^{-(x+b+k)v} \, dv \\ &= \binom{n}{x} \frac{b^a}{\Gamma(a)} \sum_{k=0}^{n-x} \binom{n-x}{k} (-1)^k \frac{\Gamma(a)}{(x+b+k)^a} \\ &= \binom{n}{x} b^a \sum_{k=0}^{n-x} \binom{n-x}{k} \frac{(-1)^k}{(x+b+k)^a}. \end{align*}$$
To utilize this result, it is probably best to choose your parameters $n, a, b$ in advance and compute the probability mass function for each $x = 0, 1, 2, \ldots, n$, provided that $n$ is not huge.
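A direct transcription of the closed form (the values of $n$, $a$, $b$ are illustrative; note the alternating sum can lose precision for large $n$):

```python
import math

def compound_pmf(x, n, a, b):
    # Pr[X = x] = C(n, x) * b^a * sum_k C(n-x, k) (-1)^k / (x + b + k)^a
    s = sum(math.comb(n - x, k) * (-1) ** k / (x + b + k) ** a
            for k in range(n - x + 1))
    return math.comb(n, x) * b ** a * s

n, a, b = 20, 10.0, 10.0
pmf = [compound_pmf(x, n, a, b) for x in range(n + 1)]
print(sum(pmf))  # a valid pmf: should be very close to 1
```

Once the table `pmf` is precomputed, sampling reduces to a categorical draw over $x = 0, \ldots, n$.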