Assume that $X_{1}$ and $X_{2}$ are independent Poisson($\theta$) random variables. By the definition of sufficiency, we need to show that
$$ P(X_{1}=x_{1}, X_{2}=x_{2}|T=t) \hspace{0.2in} \mbox{does not depend on $\theta$.} $$
I found an answer where the person gives a counterexample to show that $T$ does depend on $\theta$. But I would like to know whether there is a way to show that $T=X_{1}+2X_{2}$ is not sufficient by using the definition directly. I tried:
\begin{equation*} \begin{array}{rclll} P(X_{1}=x_{1}, X_{2}=x_{2}|T=t) & = & \dfrac{P(X_{1}=x_{1}, X_{2}=x_{2},T=t)}{P(T=t)} \\ & = & \dfrac{P(X_{1}=x_{1}, X_{2}=x_{2})}{P(T=t)} \\ & = & \dfrac{\dfrac{e^{-\theta}\theta^{x_{1}}}{x_{1}!}\cdot\dfrac{e^{-\theta}\theta^{x_{2}}}{x_{2}!}}{P(X_{1}+2X_{2}=t)} \\ \end{array} \end{equation*}
Now I need the distribution $P(X_{1}+2X_{2}=t)$, which I computed using a convolution:
\begin{equation*} \begin{array}{rclll} P(X_{1}+2X_{2}=t) & = & \displaystyle\sum_{k=0}^{\lfloor t/2\rfloor}P(X_{2}=k) \cdot P\left(X_{1}=t-2k\right) \\ & = & \displaystyle\sum_{k=0}^{\lfloor t/2\rfloor}\dfrac{e^{-\theta}\theta^{k}}{k!} \cdot \dfrac{e^{-\theta}\theta^{t-2k}}{(t-2k)!} \\ & = & e^{-2\theta}\displaystyle\sum_{k=0}^{\lfloor t/2\rfloor}\dfrac{1}{k!} \cdot \dfrac{\theta^{t-k}}{(t-2k)!} \\ \end{array} \end{equation*}
So I used the hint that someone gave in the post, but I am still stuck. I would really appreciate it if someone could help me.
No, $T = X_1 + 2X_2$ is not sufficient for $\theta$. To see why, note that $$T^* = X_1 + X_2$$ is sufficient for $\theta$; in fact it is minimal sufficient, so $T^*$ must be a function of any sufficient statistic. The sample total does not discard any information about $\theta$ that was present in the original sample $(X_1, X_2)$. If $T$ were sufficient, then any two samples $(x_1, x_2)$ and $(y_1, y_2)$ that generate the same $T$ would also have to generate the same $T^*$; i.e., $$x_1 + 2x_2 = y_1 + 2y_2 \implies x_1 + x_2 = y_1 + y_2.$$ But this fails; e.g., $(x_1, x_2) = (4,1)$ and $(y_1, y_2) = (0,3)$ both satisfy $T = 6$, yet $x_1 + x_2 = 5$ whereas $y_1 + y_2 = 3$. Because the same value of $T$ can arise from two distinct samples with different values of the sufficient statistic $T^*$, the statistic $T$ cannot possibly be sufficient for $\theta$.
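The arithmetic behind the counterexample is easy to check mechanically (a small Python sketch):

```python
# Counterexample: two samples with the same T = X1 + 2*X2
# but different sample totals T* = X1 + X2.
x1, x2 = 4, 1
y1, y2 = 0, 3

T_x, T_y = x1 + 2*x2, y1 + 2*y2
Tstar_x, Tstar_y = x1 + x2, y1 + y2

print(T_x, T_y)          # both equal 6
print(Tstar_x, Tstar_y)  # 5 and 3
```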
To really drive this point home, suppose I drew a sample and told you $T = 6$, and gave you no other information. Could you tell me whether the sample total was $5$, or $3$? You can't.
Since the question asks us to compute $$\Pr[X_1 = x_1, X_2 = x_2 \mid T = t]$$ and show that it depends on $\theta$, the simplest approach is to show that it is a non-constant function of $\theta$ in one specific case, rather than attempting to obtain the joint conditional PMF in closed form. That is, pick a convenient value of $T$ and do the calculation for that value. Since the above used $T = 6$, let's keep it. The joint distribution of $(X_1, X_2) \mid T = 6$ has support on the set $$(X_1, X_2) \in \{(0,3), (2,2), (4,1), (6,0)\}.$$ It is convenient to parametrize this support by $X_2 \in \{0, 1, 2, 3\}$ with $X_1 = 6 - 2X_2$, so that $$\Pr[X_1 = x_1, X_2 = x_2 \mid T = 6] = \frac{\Pr[(X_1,X_2,T) = (6 - 2x_2, x_2, 6)]}{\Pr[T = 6]}.$$ The numerator is simply $$\Pr[(X_1, X_2, T) = (6 - 2x_2, x_2, 6)] = e^{-\theta} \frac{\theta^{6-2x_2}}{(6-2x_2)!} e^{-\theta} \frac{\theta^{x_2}}{x_2!} = e^{-2\theta} \frac{\theta^6 \theta^{-x_2}}{(6-2x_2)! x_2!}.$$ The denominator is $$\Pr[T = 6] = \sum_{x=0}^3 e^{-\theta} \frac{\theta^{6-2x}}{(6-2x)!} e^{-\theta} \frac{\theta^{x}}{x!} = e^{-2\theta} \left(\frac{\theta^3}{6} + \frac{\theta^4}{4} + \frac{\theta^5}{24} + \frac{\theta^6}{720}\right).$$ Therefore $$\Pr[X_1 = x_1, X_2 = x_2 \mid T = 6] = \frac{\theta^{3-x_2}}{(6-2x_2)! x_2! \left(\frac{1}{6} + \frac{\theta}{4} + \frac{\theta^2}{24} + \frac{\theta^3}{720}\right)}.$$ This is not a constant function of $\theta$, which proves $T$ is not sufficient, since a single counterexample suffices.
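The dependence on $\theta$ can also be seen numerically. A Python sketch (standard library only; the helper names `joint` and `cond_given_T6` are introduced here) evaluates the conditional PMF at two values of $\theta$ and shows they differ:

```python
from math import exp, factorial

def joint(x1, x2, theta):
    """Joint PMF of independent Poisson(theta) X1 and X2."""
    return (exp(-theta) * theta**x1 / factorial(x1)
            * exp(-theta) * theta**x2 / factorial(x2))

def cond_given_T6(x2, theta):
    """P(X1 = 6 - 2*x2, X2 = x2 | T = 6); support is x2 in {0, 1, 2, 3}."""
    denom = sum(joint(6 - 2*k, k, theta) for k in range(4))
    return joint(6 - 2*x2, x2, theta) / denom

# The conditional probability of (X1, X2) = (0, 3) changes with theta,
# so the conditional distribution given T is not free of theta:
p_at_1 = cond_given_T6(3, theta=1.0)  # = 120/331, about 0.36
p_at_5 = cond_given_T6(3, theta=5.0)  # = 120/1895, about 0.06
```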
In general, the joint conditional PMF is complicated to write because it depends on whether $T$ is even or odd. In the even case, say $T = 2m$, then $X_2 \in \{0, 1, \ldots, m\}$ and $X_1 = 2(m - X_2)$. In the odd case, say $T = 2m+1$, then $X_2 \in \{0, 1, \ldots, m\}$ as before, but $X_1 = 1 + 2(m - X_2)$. Then explicitly
$$\Pr[T = t] = e^{-2\theta} \sum_{x=0}^{\lfloor t/2 \rfloor} \frac{\theta^{t - x}}{(t-2x)! x!},$$ and $$\Pr[(X_1, X_2, T) = (t - 2x_2, x_2, t)] = e^{-2\theta} \frac{\theta^{t - x_2}}{(t - 2x_2)!x_2!}.$$
It is because the marginal PMF does not admit an elementary closed form for general $t$ that we resorted to computing it for $T = 6$. Nevertheless, $e^{2\theta}\Pr[T = t]$ is a nontrivial polynomial in $\theta$ that in general will not cancel against the single power of $\theta$ in the joint PMF, so the conditional PMF is not free of $\theta$ whenever the support contains more than one point.