Let $W_1, W_2$ be unbiased estimators of $\theta$ and let $T$ be a sufficient statistic for $\theta$. Is it true that $Var(E(W_1|T)) \leq Var(W_2)$?
I think it may fail to be true.
If the statement were true, then $Var(E(W_1|T))$ would be the minimum among the variances $Var(W)$ over all unbiased estimators $W$, so $E(W_1|T)$ would be a UMVUE of $\theta$. But the Lehmann–Scheffé theorem tells us that if $T$ is a *complete* sufficient statistic and $U$ is an unbiased estimator, then $E(U|T)$ is a UMVUE of $\theta$. So if the statement were true, we wouldn't need the completeness of $T$ at all.
However, I cannot prove it. Can anyone help me?
You can take a look at the proof of the Rao–Blackwell theorem: \begin{align} \operatorname{Var}(E[W_1|T])&=E\big(E[W_1|T] - \theta\big)^2\\ &=E\big(E[W_1 - \theta\,|\,T]\big)^2\\ &\le E\big(E[(W_1 - \theta)^2\,|\,T]\big)\\ &= E(W_1 - \theta)^2=\operatorname{Var}(W_1). \end{align} The inequality follows from Jensen's inequality (or even from the Cauchy–Schwarz inequality). Note that it only compares $E[W_1|T]$ with $W_1$ itself, not with an arbitrary unbiased estimator $W_2$. In any case, it tells you only how to improve a given estimator; without the completeness of $T$, it does not necessarily give you a UMVUE.
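As a quick numerical sanity check of the variance-reduction step (a sketch under assumed settings: a Bernoulli$(p)$ sample with $W_1 = X_1$ and $T = \sum_i X_i$, in which case $E[W_1|T] = T/n$, the sample mean):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 0.5, 5, 200_000

# Draw `reps` independent samples of size n from Bernoulli(p).
X = rng.binomial(1, p, size=(reps, n))

W1 = X[:, 0]        # unbiased estimator of p: just the first observation
T = X.sum(axis=1)   # sufficient statistic: the sample total
W1_rb = T / n       # E[W1 | T] = T/n, the Rao-Blackwellized estimator

# Empirical variances: Var(W1) ~ p(1-p) = 0.25, Var(E[W1|T]) ~ p(1-p)/n = 0.05
print(W1.var(), W1_rb.var())
```

Here the conditioned estimator's variance drops from about $p(1-p)$ to about $p(1-p)/n$, illustrating $\operatorname{Var}(E[W_1|T]) \le \operatorname{Var}(W_1)$; since $T$ is complete in this model, $T/n$ happens to be the UMVUE as well.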
Thus, your statement is generally false. However, it is not that easy to construct a counterexample, since in regular cases a sufficient statistic is also complete and the first iteration of Rao–Blackwell already gives you the UMVUE.