Suppose $x_1$, $x_2$, $x_3$ are independent observations from a Bernoulli-distributed population with parameter $\theta$.
I want to show that $$\hat{\theta} = \frac{x_1 + 2x_2 + x_3}{4}$$ is not a sufficient estimator for $\theta$.
I derived earlier that $$L(\theta) = (1 - \theta)^3 \left(\frac{\theta}{1 - \theta}\right)^{x_1 + x_2 + x_3},$$ which allowed me to show that $\overline{x}$ is a sufficient estimator for $\theta$. However, I am not sure how to proceed to show that $\hat{\theta}$ is not sufficient for $\theta$ (even though this seems intuitively clear to me). I have tried using the formal definition of a sufficient statistic, but with no concrete results.
How would one go about showing $\hat{\theta}$ is not sufficient for estimating $\theta$?
For example, you can use the definition and show that there exist some $k_1,k_2,k_3\in\{0,1\}$ and some $s\in\{0,\frac14,\frac24,\frac34,1\}$ such that $$ \mathbb P(x_1=k_1,x_2=k_2, x_3=k_3\mid \hat\theta = s) $$ depends on $\theta$.
Say, $$ \mathbb P\left(x_1=1,x_2=0, x_3=1\biggm| \hat\theta = \frac24\right) =\frac{\mathbb P(x_1=1,x_2=0, x_3=1)}{\mathbb P(x_1+2x_2+x_3=2)} $$ $$ =\frac{\mathbb P(x_1=1,x_2=0, x_3=1)}{\mathbb P(x_1=1,x_2=0, x_3=1)+\mathbb P(x_1=0,x_2=1, x_3=0)} $$ $$=\frac{\theta^2(1-\theta)}{\theta^2(1-\theta)+\theta(1-\theta)^2}= \theta. $$ Since this conditional probability depends on $\theta$, $\hat\theta$ is not sufficient for $\theta$.
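If you want to sanity-check this computation, here is a small sketch in Python using exact rational arithmetic: it enumerates all $(x_1,x_2,x_3)\in\{0,1\}^3$ with $x_1+2x_2+x_3=2$ and confirms that the conditional probability above equals $\theta$ for a few values of $\theta$ (the function names `joint` and `cond_prob` are just for illustration).

```python
from fractions import Fraction

def joint(k1, k2, k3, theta):
    # P(x1=k1, x2=k2, x3=k3) for i.i.d. Bernoulli(theta) observations
    p = Fraction(1)
    for k in (k1, k2, k3):
        p *= theta if k == 1 else 1 - theta
    return p

def cond_prob(theta):
    # P(x1=1, x2=0, x3=1 | x1 + 2*x2 + x3 = 2), i.e. given theta_hat = 2/4
    num = joint(1, 0, 1, theta)
    den = sum(joint(k1, k2, k3, theta)
              for k1 in (0, 1) for k2 in (0, 1) for k3 in (0, 1)
              if k1 + 2 * k2 + k3 == 2)
    return num / den

# The conditional probability equals theta, so it depends on theta:
for theta in (Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)):
    assert cond_prob(theta) == theta
```

Because the conditional distribution of the data given $\hat\theta$ varies with $\theta$, a single counterexample like this is enough to rule out sufficiency.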