This question arises in a statistical context, but my difficulty is more ‘pure math’ in nature, so I have posted it here rather than on the statistics forum.
I am to prove that $V := \sum^n_{i=1} X_i$ and $W := \prod^n_{i=1} X_i$ are sufficient statistics for the parameters of the gamma distribution.
By the factorisation theorem, if the joint pdf of a collection of random variables $\vec{X} := (X_1, \cdots, X_n)$ can be factorised in the following manner:
$$f(\vec{X}|\theta) = g(T(\vec{X}), \theta)\;h(\vec{X})$$
where $T(\vec{X})$ is the statistic, then $T(\vec{X})$ is a sufficient statistic for the parameter $\theta$.
I have expressed the joint distribution of iid gamma variables as follows:
$$\begin{align*} f(\vec{X}|\alpha, \lambda) &= \prod^n_{i=1} f(X_i|\alpha, \lambda) \\ &= \lambda^{n\alpha}\;\Gamma(\alpha)^{-n}\;W^{\alpha-1}\;e^{-\lambda V} \end{align*}$$
Clearly, for the statistic $T(\vec{X}) = V$, the factorised form has $g(T(\vec{X}), \alpha, \lambda) = \lambda^{n\alpha}\; \Gamma(\alpha)^{-n}\; e^{-\lambda V}$. In a similar fashion, the statistic $T(\vec{X}) = W$ has $g(T(\vec{X}), \alpha, \lambda) = \lambda^{n\alpha}\; \Gamma(\alpha)^{-n}\; W^{\alpha-1}$.
But I cannot figure out how the remaining term can be expressed as just $h(\vec{X})$. For the former statistic $V$, the remaining term is $W^{\alpha-1}$, which depends on $\alpha$; for the latter $W$, the remaining term is $e^{-\lambda V}$, which depends on $\lambda$. Yet I know for certain (from the literature) that both $W$ and $V$ are indeed sufficient statistics.
Obviously, I must have some misconception here. Can someone correct me, please?
The pdf is $f(x|\alpha, \beta) = \frac{1}{\Gamma(\alpha) \beta^\alpha} x^{\alpha-1} e^{-\frac{x}{\beta}} \chi_{[0,\infty)}(x)$; the joint density is then $$ f(\underline{X}|\alpha, \beta) = \frac{1}{\Gamma(\alpha)^n \beta^{n\alpha}} W^{\alpha-1} e^{-\frac{V}{\beta}} \prod_{i=1}^n \chi_{[0,\infty)}(x_i), $$ which is already in the factorised form, with $T(\underline{X}) = (V, W)$ and $h(\underline{X}) = \prod_{i=1}^n \chi_{[0,\infty)}(x_i)$. Note that $(V, W)$ are jointly sufficient for $(\alpha, \beta)$, but neither is sufficient on its own.
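For what it's worth, the factorised joint density above can be checked numerically. The following Python sketch compares the product of gamma marginals against the factored expression in terms of $V$ and $W$; the parameter values and sample are arbitrary choices, and SciPy parametrises the gamma density by shape $a = \alpha$ and scale $= \beta$:

```python
# Sanity check: product of gamma marginals vs. the factorised form
# Gamma(alpha)^{-n} beta^{-n alpha} W^{alpha-1} exp(-V / beta).
# The parameter values and the sample are arbitrary illustrative choices.
import numpy as np
from scipy.stats import gamma as gamma_dist
from scipy.special import gamma as gamma_fn

alpha, beta = 2.5, 1.3         # shape and scale parameters
x = np.array([0.7, 1.9, 3.2])  # an arbitrary positive sample
n = len(x)

# Joint density as a product of i.i.d. marginals
joint = np.prod(gamma_dist.pdf(x, a=alpha, scale=beta))

# Factorised form in terms of V = sum(x) and W = prod(x)
V, W = x.sum(), x.prod()
factored = (gamma_fn(alpha) ** (-n) * beta ** (-n * alpha)
            * W ** (alpha - 1) * np.exp(-V / beta))

assert np.isclose(joint, factored)
```

The check passes for any positive sample, which is just the factorisation theorem in action: the entire dependence on $(\alpha, \beta)$ is carried through $(V, W)$.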