Suppose $(X_1,\ldots,X_n)$ is a random sample, $V_n$ is an unbiased estimator of the population parameter $\theta$, and $T_n$ is a sufficient statistic for $\theta$. Then, by the Rao–Blackwell theorem, the rv
$$\varphi(T_n):=\mathbb E_\theta[V_n\mid T_n]$$
is an unbiased estimator of $\theta$ which is uniformly at least as good as $V_n$, i.e. $\operatorname{Var}_\theta(\varphi(T_n))\le\operatorname{Var}_\theta(V_n)$ for every $\theta$.
I know the definitions of conditional expectation and conditional distribution, but I can't see how sufficiency makes $\varphi(T_n)$ an estimator, i.e. a function of the random sample that doesn't depend on $\theta$.
For simplicity, consider the discrete case. Writing $X:=(X_1,\ldots,X_n)$,
$$\mathbb E_\theta[V_n\mid T_n=t]=\sum_{\lbrace x:T_n(x)=t\rbrace}\mathbb P_\theta(X=x\mid T_n=t)\,V_n(x)$$
By sufficiency of $T_n$, the conditional probability $\mathbb P_\theta(X=x\mid T_n=t)=:g(x,t)$ does not depend on $\theta$. So,
$$\mathbb E_\theta[V_n\mid T_n=t]=\sum_{\lbrace x:T_n(x)=t\rbrace}g(x,t)V_n(x)$$
does not depend on $\theta$. Hence $\varphi(T_n)=\varphi(T_n(X))$ is a function of the data alone, i.e. a statistic.
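A minimal simulation sketch of this, for a concrete example not in the text: take $X_i\sim\text{Bernoulli}(p)$, the crude unbiased estimator $V_n=X_1$, and the sufficient statistic $T_n=\sum_i X_i$. By symmetry, $\varphi(t)=\mathbb E[X_1\mid T_n=t]=t/n$, visibly a function of the data alone, and the simulation shows its variance is much smaller than that of $V_n$ (the function name `simulate` and all parameters are illustrative).

```python
import random

def simulate(p, n, reps, seed=0):
    """Compare V_n = X_1 with its Rao-Blackwellization phi(T_n) = T_n / n
    on a Bernoulli(p) sample of size n, over `reps` replications."""
    rng = random.Random(seed)
    v_vals, phi_vals = [], []
    for _ in range(reps):
        x = [1 if rng.random() < p else 0 for _ in range(n)]
        t = sum(x)                 # sufficient statistic T_n
        v_vals.append(x[0])        # crude unbiased estimator V_n = X_1
        phi_vals.append(t / n)     # phi(T_n) = E[X_1 | T_n] = T_n / n
    mean = lambda a: sum(a) / len(a)
    var = lambda a: (lambda m: sum((v - m) ** 2 for v in a) / len(a))(mean(a))
    return mean(v_vals), var(v_vals), mean(phi_vals), var(phi_vals)
```

Both estimators are unbiased (means near $p$), but the Rao–Blackwellized one has variance $p(1-p)/n$ instead of $p(1-p)$, matching the theorem's guarantee.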