Consider the two-parameter Gamma($\alpha$,$\beta$) distribution with PDF
$$f(x|\alpha,\beta) = \frac{\beta^\alpha x^{\alpha - 1} \exp(-\beta x)}{\Gamma(\alpha)}, \quad x>0, \alpha>0, \beta>0,$$
where the rate (inverse scale) parameter, $\beta$, can have any of a number of continuous prior distributions with PDF $g_i(\beta)$, $\beta > 0$.
My question is:
For fixed $\alpha$, is the mixture PDF,
$$f(x) = \int_0^\infty f(x|\alpha,\beta) g_i(\beta) d\beta, \quad x>0,$$
identifiable with respect to the mixing distribution $g_i$?
In other words, does $$\int_0^\infty f(x|\alpha,\beta) g_j(\beta) d\beta = \int_0^\infty f(x|\alpha,\beta) g_k(\beta) d\beta, \quad x>0 \qquad (1)$$
imply $$g_j(\beta) = g_k(\beta), \quad \beta>0?$$
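For concreteness, the mixture PDF can be evaluated by numerical quadrature. The sketch below (Python with SciPy) uses an illustrative Gamma mixing density and parameter values that are not part of the question itself; it checks the quadrature result against the known closed form for a Gamma prior on $\beta$ with unit rate, which yields a beta-prime marginal.

```python
import numpy as np
from scipy import integrate, stats

def mixture_pdf(x, alpha, g):
    """Mixture density f(x) = \int_0^\infty f(x | alpha, beta) g(beta) dbeta,
    evaluated by adaptive quadrature.  Here beta is the rate parameter,
    so scale = 1/beta in SciPy's Gamma parameterization."""
    integrand = lambda b: stats.gamma.pdf(x, a=alpha, scale=1.0 / b) * g(b)
    val, _err = integrate.quad(integrand, 0.0, np.inf)
    return val

# Illustrative mixing density: beta ~ Gamma(a0 = 3, rate 1).
alpha = 2.0
g = lambda b: stats.gamma.pdf(b, a=3.0, scale=1.0)

# With this g, the mixture has a closed form: a beta-prime(alpha, a0) density.
x = 1.5
print(mixture_pdf(x, alpha, g), stats.betaprime.pdf(x, alpha, 3.0))
```

The closed-form check works because compounding Gamma($\alpha$, rate $\beta$) with $\beta \sim$ Gamma($a_0$, rate $1$) integrates to the beta-prime($\alpha$, $a_0$) density.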
I've looked for this result (or its negation) extensively, but with no luck. The closest thing I've found is a discussion of the identifiability of Gamma($\alpha$,$\beta$) mixtures with regard to the $\alpha$ parameter (for fixed $\beta$), which Maritz and Lwin (Empirical Bayes Methods with Applications) show are identifiable because Gamma($\alpha$,$\beta$) is an additively closed family with respect to $\alpha$.
Thanks in advance for any references or suggestions!
Addendum: I should have noted that for $\alpha = 1$ (i.e., the Exponential($\beta$) case), the mixture PDF,
$$f(x) = \int_0^\infty f(x|\alpha = 1,\beta) g_i(\beta) d\beta,$$
appears to be identifiable provided the Laplace transforms of the $g_i(\beta)$ are well defined in a neighborhood of $0$. This is because Eq. (1), together with successive derivatives of both sides with respect to $x$, can be used to show that all raw moments of $g_j(\beta)$ and $g_k(\beta)$ are identical. Equality of moments alone doesn't imply that the two PDFs are the same, but the Laplace-transform condition makes the moment problem determinate, so in that case the moments do pin down the mixing PDF.
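The moment argument in the addendum can be written out explicitly. A sketch, assuming differentiation under the integral sign is justified and all moments of $g$ are finite (both of which hold when the Laplace transform of $g$ exists in a neighborhood of $0$):

```latex
% Exponential case (alpha = 1):
%   f(x) = \int_0^\infty \beta e^{-\beta x} g(\beta)\, d\beta.
% Differentiating n times with respect to x under the integral sign:
f^{(n)}(x) = (-1)^n \int_0^\infty \beta^{\,n+1} e^{-\beta x} g(\beta)\, d\beta,
\qquad
\lim_{x \to 0^+} f^{(n)}(x) = (-1)^n \operatorname{E}\!\left[\beta^{\,n+1}\right].
```

So if two mixing PDFs produce the same mixture $f$, every raw moment $\operatorname{E}[\beta^{m}]$, $m \ge 1$, must agree between them.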