Show $T$ is a sufficient statistic using definition of sufficiency


Suppose I have two independent random variables $X_i \sim \mathrm{Gamma}(i\theta, 1)$ for $i = 1, 2$.

Show that $T = X_1 + X_2$ is a sufficient statistic using the definition of sufficient statistics (not the factorization criterion).

I am using the definition from Casella & Berger:

  • If $p(\mathbf{x}|\theta)$ is the joint pdf or pmf of $\mathbf{X}$ and $q(t|\theta)$ is the pdf or pmf of $T(\mathbf{X})$, then $T(\mathbf{X})$ is a sufficient statistic for $\theta$ if, for every $\mathbf{x}$ in the sample space, the ratio $\frac{p(\mathbf{x}|\theta)}{q(T(\mathbf{x})|\theta)}$ is a constant function of $\theta$.

Now, since $X_1$ and $X_2$ are independent gammas with the same scale, their shape parameters add, so $T \sim \mathrm{Gamma}(\theta + 2\theta, 1) = \mathrm{Gamma}(3\theta, 1)$.
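As a quick sanity check on this claim (not part of the proof), one can simulate $X_1 + X_2$ and compare the sample mean and variance to those of $\mathrm{Gamma}(3\theta, 1)$, which are both $3\theta$. The sample size and seed below are arbitrary choices:

```python
import random

random.seed(0)
theta = 1.5
n = 200_000

# Simulate T = X1 + X2 with X1 ~ Gamma(theta, 1), X2 ~ Gamma(2*theta, 1) independent
samples = [random.gammavariate(theta, 1) + random.gammavariate(2 * theta, 1)
           for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

# Gamma(3*theta, 1) has mean 3*theta and variance 3*theta (= 4.5 here)
print(mean, var)
```

Both printed values should be close to $3\theta = 4.5$, consistent with $T \sim \mathrm{Gamma}(3\theta, 1)$.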

Thus

\begin{align*}
\frac{p(x_1, x_2 \mid \theta)}{q(t \mid \theta)}
&= \frac{f_{X_1}(x_1)\, f_{X_2}(x_2)}{f_{T}(t)}\\[4pt]
&= \frac{\dfrac{\exp(-x_1 - x_2)\, x_1^{\theta-1} x_2^{2\theta-1}}{\Gamma(\theta)\Gamma(2\theta)}}{\dfrac{\exp(-x_1 - x_2)\, (x_1 + x_2)^{3\theta-1}}{\Gamma(3\theta)}}\\[4pt]
&= \frac{\Gamma(3\theta)\, x_1^{\theta-1} x_2^{2\theta-1}}{\Gamma(\theta)\Gamma(2\theta)\, (x_1 + x_2)^{3\theta-1}}
\end{align*}

Now this still looks dependent on $\theta$ to me. Any ideas on what I did wrong?
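For what it's worth, a small numerical check of the final expression (evaluating it at a fixed, arbitrarily chosen point $(x_1, x_2) = (1, 2)$ for two values of $\theta$) agrees that the ratio varies with $\theta$, so the algebra above is at least self-consistent:

```python
from math import gamma

def ratio(theta, x1, x2):
    """The final ratio Γ(3θ) x1^(θ-1) x2^(2θ-1) / (Γ(θ) Γ(2θ) (x1+x2)^(3θ-1))."""
    t = x1 + x2
    return (gamma(3 * theta) * x1 ** (theta - 1) * x2 ** (2 * theta - 1)
            / (gamma(theta) * gamma(2 * theta) * t ** (3 * theta - 1)))

# Same (x1, x2), different theta: the ratio changes, so it is not constant in theta
print(ratio(1.0, 1.0, 2.0))  # 4/9 ≈ 0.4444
print(ratio(2.0, 1.0, 2.0))  # 960/1458 ≈ 0.6584
```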