A random sample $X_{1},\ldots,X_{n}$ is drawn from a gamma distribution. Are there jointly sufficient statistics for the two unknown parameters based on these observations?
The gamma density is $f(x;\alpha,\beta)=\frac{x^{\alpha-1}}{\beta^{\alpha}\Gamma(\alpha)}e^{-x/\beta}$.
I have a rough idea of what jointly sufficient statistics are, but I am not sure how to proceed from here. Should I perhaps take the product $\prod_{i=1}^{n}$ of the density over the observations? Can anybody help? Thanks!
First, recall the Fisher–Neyman factorization theorem (see the Wikipedia article on sufficient statistics): a statistic $T(\vec{x})$ is sufficient for a parameter $\theta$ if and only if the joint density factors as $f_\theta(\vec{x}) = h(\vec{x})\, g_{\theta}(T(\vec{x}))$, where $h$ does not depend on $\theta$ and $g_{\theta}$ depends on the data only through $T(\vec{x})$.
Here we have $\theta=\{\alpha,\beta\}$. In our case: $$\begin{align}f(\vec{x})=f(x_1,\ldots,x_n) &= \prod_{i=1}^n \left({1 \over \Gamma(\alpha) \beta^{\alpha}} x_i^{\alpha -1} e^{-\frac{x_i} {\beta}} \right)= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}.\end{align} \tag{1}$$ We can take $h(\vec{x})=1$; then the whole right-hand side of $(1)$ is $g_{\alpha,\beta}(T(\vec{x}))$, i.e. $$g_{\alpha,\beta}(T(\vec{x}))= {1 \over \Gamma(\alpha)^n \beta^{n\alpha}}\left(\prod_{i=1}^n x_i\right)^{\alpha-1} e^{-{1 \over \beta} \sum_{i=1}^n{x_i}}.$$ Since $g_{\alpha,\beta}(T(\vec{x}))$ depends on the sample only through $\prod_{i=1}^n x_i$ and $\sum_{i=1}^n{x_i}$, these two statistics are jointly sufficient for $(\alpha,\beta)$, i.e. $$T(\vec{x})=\left(\prod_{i=1}^n x_i, \ \sum_{i=1}^n{x_i}\right).$$
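To see the factorization concretely, here is a small numerical sketch (parameter values and sample size are arbitrary choices for illustration): the gamma log-likelihood evaluated from the raw sample agrees with the log-likelihood evaluated from the sufficient statistics alone. Note that $\left(\sum_i \log x_i, \sum_i x_i\right)$ is an equivalent and numerically safer form of $T(\vec{x})$, since $\log \prod_i x_i = \sum_i \log x_i$.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)
alpha, beta = 2.5, 1.5                  # shape and scale (assumed values)
x = rng.gamma(alpha, beta, size=50)     # simulated gamma sample
n = len(x)

# Log-likelihood computed from the full raw sample:
# sum_i [(alpha-1) log x_i - x_i/beta] - n [log Gamma(alpha) + alpha log beta]
ll_raw = (np.sum((alpha - 1) * np.log(x) - x / beta)
          - n * (lgamma(alpha) + alpha * np.log(beta)))

# Log-likelihood computed from the sufficient statistics alone.
T = (np.sum(np.log(x)), np.sum(x))      # (log of product, sum)
ll_stat = ((alpha - 1) * T[0] - T[1] / beta
           - n * (lgamma(alpha) + alpha * np.log(beta)))

print(np.isclose(ll_raw, ll_stat))      # the two computations agree
```

The sample enters the likelihood only through `T`, which is exactly what sufficiency says: once $T(\vec{x})$ is known, the individual observations carry no further information about $(\alpha,\beta)$.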