As stated in the book:
6.36 One advantage of using a minimal sufficient statistic is that unbiased estimators will have smaller variance, as the following exercise will show. Suppose that $T_1$ is sufficient and $T_2$ is minimal sufficient, $U$ is an unbiased estimator of $\theta$, and define $U_1=E(U|T_1)$ and $U_2=E(U|T_2)$.
(a) Show that $U_2=E(U_1|T_2)$,
(b) Now use the conditional variance formula (Theorem 4.4.7) to show that $\mathrm{Var}\,U_2\leq\mathrm{Var}\,U_1$.
Unsure of how to solve (a), I treated its result as a lemma and was able to use the conditional variance formula (law of total variance) to establish the inequality in (b).
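For reference, here is the argument I used for (b), taking the result of (a) as given:

$$ \mathrm{Var}\,U_1 = \mathrm{Var}\bigl(E(U_1|T_2)\bigr) + E\bigl(\mathrm{Var}(U_1|T_2)\bigr) = \mathrm{Var}\,U_2 + E\bigl(\mathrm{Var}(U_1|T_2)\bigr) \geq \mathrm{Var}\,U_2, $$

where the first equality is Theorem 4.4.7, the second uses $U_2=E(U_1|T_2)$ from part (a), and the inequality holds because $\mathrm{Var}(U_1|T_2)\geq 0$.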
Back to part (a), I tried working backwards starting with
$$ E(U_1|T_2)=E(E(U|T_1)|T_2)=?=E(U|T_2)=U_2. $$
The step marked (?) is probably where I need to use the minimal sufficiency of $T_2$, but I am unsure how to proceed.
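To convince myself the claims are at least numerically plausible, I ran a small simulation on a toy example of my own (not from the book): $X_1,X_2,X_3$ i.i.d. Bernoulli$(\theta)$, with $T_1=(X_1,\,X_2+X_3)$ sufficient, $T_2=X_1+X_2+X_3$ minimal sufficient, and $U=X_1$ unbiased for $\theta$. Here $U_1=E(U|T_1)=X_1$ and $U_2=E(U|T_2)=T_2/3$, so the simulation checks both the variance inequality and, empirically, that $E(U_1|T_2)=U_2$:

```python
import numpy as np

# Toy example (my own, not from the book):
# X1, X2, X3 iid Bernoulli(theta)
# T1 = (X1, X2 + X3) is sufficient; T2 = X1 + X2 + X3 is minimal sufficient
# U  = X1 is unbiased for theta
# Then U1 = E(U | T1) = X1, and U2 = E(U | T2) = T2 / 3 by symmetry.

rng = np.random.default_rng(0)
theta = 0.3
n_sims = 200_000

x = rng.binomial(1, theta, size=(n_sims, 3))
u1 = x[:, 0]          # E(X1 | T1) = X1, since X1 is a coordinate of T1
t2 = x.sum(axis=1)
u2 = t2 / 3           # E(X1 | T2) = T2 / 3 by exchangeability

# Both estimators are (empirically) unbiased, but U2 has smaller variance:
print("means:", u1.mean(), u2.mean())   # both near theta
print("vars: ", u1.var(), u2.var())     # near theta(1-theta) vs theta(1-theta)/3

# Empirical check of (a): E(U1 | T2 = t) should equal t/3 for each t
for t in range(4):
    mask = t2 == t
    print(t, u1[mask].mean(), t / 3)
```

The printed conditional means of $U_1$ given $T_2=t$ land close to $t/3$, which is exactly the value of $U_2$ on that event, consistent with $U_2=E(U_1|T_2)$.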