Source of the question: Exercise 6.22(a) of *Statistical Inference* by Casella and Berger.
Let $X_1,...,X_n$ be a random sample from a population with pdf
$$f(x|\theta)=\theta x^{\theta-1},0<x<1, \theta>0.$$
Is $\Sigma X_i$ sufficient for $\theta$?
I know how to find a sufficient statistic. I can use either the factorization theorem or the exponential-family approach. One sufficient statistic is $\prod_i X_i$. I also know that other sufficient statistics can be obtained as one-to-one functions of $\prod_i X_i$. But I am wondering whether there is a more rigorous way to conclude that $\Sigma X_i$ is *not* sufficient.
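For completeness, here is a sketch of the factorization that makes $\prod_i X_i$ sufficient, writing the joint pdf in the form $g(T(\mathbf{x})\mid\theta)\,h(\mathbf{x})$ required by the factorization theorem:

$$\prod_{i=1}^n f(x_i\mid\theta) \;=\; \theta^n \Bigl(\prod_{i=1}^n x_i\Bigr)^{\theta-1} \;=\; \underbrace{\theta^n\, t^{\theta-1}\Big|_{t=\prod_i x_i}}_{g(T(\mathbf{x})\mid\theta)} \cdot \underbrace{1}_{h(\mathbf{x})}, \qquad 0<x_i<1.$$

Since the joint pdf depends on the data only through $\prod_i x_i$, that product is sufficient for $\theta$.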
It follows from the definition of minimal sufficiency that a minimal sufficient statistic is a function of *every* sufficient statistic, and one can check (e.g. via the likelihood-ratio criterion, Theorem 6.2.13 in Casella and Berger) that $\prod X_i$ is minimal sufficient. So if $\sum X_i$ were sufficient, it would have to hold that
$$\prod X_i = f( \sum X_i) \mbox{ for some } f.$$
Let $S_1: \mathbb{R}^n \to \mathbb{R}$ be defined by $S_1({\bf{X}}) = \prod X_i$, ${\bf X} = (X_1,...,X_n)$, and $S_2: \mathbb{R}^n \to \mathbb{R}$ be defined by $S_2({\bf{X}}) = \sum X_i$.
Evidently, if the above relation holds, then for all ${\bf X_1}, {\bf X_2}$ with entries in $(0,1)$ we must have $S_2({\bf X_1})=S_2({\bf X_2}) \implies S_1({\bf X_1})=S_1({\bf X_2})$. This is readily seen to be false: for $n=2$, take ${\bf X_1}=(1/2,1/2)$ and ${\bf X_2}=(1/4,3/4)$. Both have sum $1$, but the products are $1/4$ and $3/16$ respectively, so no such $f$ can exist, and $\sum X_i$ is not sufficient.
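The counterexample can also be checked numerically: a minimal sketch (the sample points and $\theta$ values below are illustrative choices, not from the book) showing that two samples with the same sum give a likelihood ratio that still depends on $\theta$, so the sum cannot carry all the information about $\theta$:

```python
import numpy as np

# Two samples in (0, 1) with equal sums but different products (n = 2).
x1 = np.array([0.5, 0.5])
x2 = np.array([0.25, 0.75])

assert x1.sum() == x2.sum()    # same value of S_2 = sum(X_i)
assert x1.prod() != x2.prod()  # different value of S_1 = prod(X_i)

def likelihood(x, theta):
    """Joint pdf: f(x | theta) = theta^n * (prod x_i)^(theta - 1)."""
    return theta ** len(x) * x.prod() ** (theta - 1)

# If sum(X_i) were sufficient, the likelihood ratio of two samples with
# equal sums would be constant in theta. Here it is not.
ratios = [likelihood(x1, t) / likelihood(x2, t) for t in (0.5, 1.0, 2.0)]
print(ratios)  # varies with theta: (prod(x1)/prod(x2))**(theta - 1)
```

The ratio equals $(1/4 \div 3/16)^{\theta-1} = (4/3)^{\theta-1}$, which is constant only at $\theta=1$, confirming the argument above.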