I have a sample $X_1,\ldots,X_n$ of i.i.d. random variables from $U(\theta - 1/2; \theta +1/2)$. It is well known that $T = (X_{(1)}; X_{(n)})$ is a sufficient but not complete statistic, because $X_{(n)}-X_{(1)} - (n-1)/(n+1)$ is an unbiased estimator of zero.
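A quick Monte Carlo check supports this claim (a sketch only; the function name and parameters are my own): the statistic $X_{(n)}-X_{(1)} - (n-1)/(n+1)$ averages out to zero regardless of $\theta$.

```python
import random

def range_bias(theta=0.0, n=5, reps=200_000, seed=1):
    """Estimate E[X_(n) - X_(1) - (n-1)/(n+1)] for U(theta-1/2, theta+1/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xs = [theta - 0.5 + rng.random() for _ in range(n)]
        total += max(xs) - min(xs) - (n - 1) / (n + 1)
    return total / reps
```

The returned average should be within Monte Carlo error of zero, which is exactly the nonzero-function-of-$T$-with-zero-mean that rules out completeness.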
I want to find an unbiased estimator of $\theta$ with minimal variance. I have a crude unbiased estimator of $\theta$: $$\hat {\theta} = X_1.$$ By the Rao–Blackwell–Kolmogorov theorem, the estimator $$\hat {\theta}_1 = E\left[X_1\mid X_{(1)} = t_1; X_{(n)} = t_2\right] = \frac{t_1 + t_2}{2}$$ is unbiased and uniformly better than $\hat {\theta} = X_1$.
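The improvement is easy to see numerically (a sketch with my own function names, assuming $n = 5$): the empirical variance of the midrange comes out far below $\operatorname{Var}(X_1) = 1/12$.

```python
import random

def estimator_variances(theta=0.0, n=5, reps=200_000, seed=2):
    """Empirical variances of the crude estimator X_1 and the midrange."""
    rng = random.Random(seed)
    crude, mid = [], []
    for _ in range(reps):
        xs = [theta - 0.5 + rng.random() for _ in range(n)]
        crude.append(xs[0])                  # crude unbiased estimator X_1
        mid.append((min(xs) + max(xs)) / 2)  # Rao-Blackwellized estimator

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return var(crude), var(mid)
```

For $n = 5$ the midrange variance should be near the known value $\frac{1}{2(n+1)(n+2)} = 1/84$, versus $1/12$ for $X_1$.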
But because of the non-completeness of $T = (X_{(1)}; X_{(n)})$, this is not the only unbiased estimator that is a function of $T$. Moreover, if I iterate the Rao–Blackwell–Kolmogorov theorem with this new estimator $\hat {\theta}_1 =\frac{t_1 + t_2}{2}$, I do not improve it: $$E\left[\frac {X_{(1)} + X_{(n)}}{2}\mid X_{(1)} = t_1; X_{(n)} = t_2\right] = \frac{t_1 + t_2}{2} = \hat {\theta}_1.$$
Is $\frac {X_{(1)} + X_{(n)}}{2}$ the optimal unbiased estimator of $\theta$ (even if not the unique one)?
The sample midrange $U=\frac{X_{(1)}+X_{(n)}}{2}$ cannot be called the optimal unbiased estimator of $\theta$ in the sense of having minimum variance among all unbiased estimators. This is partly because, as you say, the minimal sufficient statistic $(X_{(1)},X_{(n)})$ is not complete, so no complete sufficient statistic exists. That there is in fact no uniformly minimum variance unbiased estimator (UMVUE) of $\theta$ for $n>1$ is discussed in this paper by Lehmann and Scheffé; a proof for the $n=1$ case is given here.
However, $U$ is the optimal estimator of $\theta$ in some restricted class of estimators, for example:
- It is the best linear unbiased estimator (BLUE) of $\theta$ ('best' meaning minimum variance) among all linear unbiased estimators based on $X_{(1)}$ and $X_{(n)}$.
- It is the (unique) uniformly minimum risk equivariant estimator (UMREE) of $\theta$ under squared error loss with respect to the location transformation group (also called the Pitman estimator).
- It is the unique minimax estimator of $\theta$ under squared error loss.
The last two bullets are discussed in Shao's Mathematical Statistics, for example.
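As a sanity check on the equivariance claim, the Pitman computation is short here because the likelihood is flat on an interval. We have $$L(\theta) = \prod_{i=1}^n \mathbf{1}\{\theta - 1/2 \le X_i \le \theta + 1/2\} = \mathbf{1}\{X_{(n)} - 1/2 \le \theta \le X_{(1)} + 1/2\},$$ so the Pitman estimator is the midpoint of that interval: $$\hat\theta_{\mathrm{Pitman}} = \frac{\int_{-\infty}^{\infty} \theta\, L(\theta)\, d\theta}{\int_{-\infty}^{\infty} L(\theta)\, d\theta} = \frac{(X_{(n)} - 1/2) + (X_{(1)} + 1/2)}{2} = \frac{X_{(1)} + X_{(n)}}{2} = U.$$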