Can we use Factorization criterion for proving given estimator is not Sufficient?


I was learning about the sufficiency of estimators and the factorization criterion. I noticed that whenever we prove a given estimator is not sufficient, we use a counterexample with concrete values. My question is: why don't we also use the factorization criterion to disprove sufficiency?
For example: if $X_{1}$ and $X_{2}$ are independent random variables having binomial distributions with parameters $\theta$ and $n_{1}$, and $\theta$ and $n_{2}$, respectively, show that $\frac{X_{1}+2X_{2}}{n_{1}+2n_{2}}$ is not a sufficient estimator. The usual solution assigns concrete values to $x_{1}$ and $x_{2}$ and shows that the conditional distribution is not independent of the parameter $\theta$. But how can one use the factorization theorem to prove this fact? I tried to factorize, but I could not produce a factorization from which we can conclude it is not a sufficient estimator (note that the theorem is an iff). Why don't many books use this method to solve the problem?
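For concreteness, here is a small Python sketch of the counterexample approach the books use. The specific choices $n_{1}=2$, $n_{2}=1$ and the event $X_{1}+2X_{2}=2$ are my own illustration, not necessarily the ones in the textbook: with these values the pairs $(x_1,x_2)=(2,0)$ and $(0,1)$ both give the same value of the statistic, and the conditional probability of $(2,0)$ given that value works out to $\theta$ itself, so it visibly depends on the parameter.

```python
from math import comb

def binom_pmf(k: int, n: int, p: float) -> float:
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def cond_prob(theta: float, n1: int = 2, n2: int = 1) -> float:
    """P(X1 = 2, X2 = 0 | X1 + 2*X2 = 2).

    With n1 = 2 and n2 = 1, the only pairs (x1, x2) satisfying
    x1 + 2*x2 = 2 are (2, 0) and (0, 1), so the conditional
    probability is the ratio of the first joint probability to
    the sum of both.
    """
    joint_20 = binom_pmf(2, n1, theta) * binom_pmf(0, n2, theta)
    joint_01 = binom_pmf(0, n1, theta) * binom_pmf(1, n2, theta)
    return joint_20 / (joint_20 + joint_01)

# The conditional distribution depends on theta, so the statistic
# cannot be sufficient: algebraically the ratio simplifies to theta.
for theta in (0.3, 0.7):
    print(theta, cond_prob(theta))
```

Since the conditional probability changes with $\theta$, the statistic fails the definition of sufficiency directly, with no need to search for a factorization.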
By the way, I am using John E. Freund's Mathematical Statistics with Applications as a reference.